Canadian Classification of the Functions of Government (CCOFOG) Methodology

Universe

CCOFOG data are presented for all general government sectors: the federal general government sector, the provincial and territorial general government sector, the local general government sector, the colleges and universities sector, and the health, school board, and Canada and Quebec Pension Plan sectors. Canadian Classification of the Functions of Government (CCOFOG) coding is applied at the program level to the general ledger accounts, specified purpose accounts, special funds, and income statements of specific entities, such as colleges and universities.

For the province of Nova Scotia, program expenses were estimated from the provincial general government budget documents. This is similar to the methodology used in the previous Financial Management System (FMS) framework.

Data composition

The published CCOFOG data represent expenses only, excluding the consumption of fixed capital. They also exclude acquisitions of non-financial assets. The CCOFOG data are currently available for the period 2008 to 2012.

Data reliability

CCOFOG provisional data are being released for the first time in November 2014. The provisional qualifier signals to users that, although the data are fit for use, they are subject to revision. Over the next year, these data will be integrated into the rest of the Canadian System of Macroeconomic Accounts (the National Accounts, Balance of Payments, International Investment Position, and Input-Output Tables), resulting in revisions as data, concepts and methods are reconciled and aligned within the national accounts framework.

Data limitations

A CCOFOG-based analysis must be limited to a time-series analysis of a single jurisdiction within a single government sector. Transfers among jurisdictions and among government sectors are not consolidated. In practice, this means that adding, for example, the health and education sectors to the provincial general government sector over-estimates expenses, because the value of the transfers is double-counted.

Inter-provincial comparisons are likewise invalid: because transfers are not reconciled, a government sector cannot be compared across two provinces in which it has different responsibilities. For example, Ontario delegated a majority of its social housing responsibilities to the local government sector, but British Columbia did not. Comparing CCOFOG data from division 710 – Social protection for the general government sector of these two provinces is therefore statistically invalid.

Coding process

The CCOFOG classification has three levels. The highest level, referred to as the division, has 10 separate categories. The second level is referred to as the group, and the lowest level is referred to as the class. In this initial CCOFOG release, the data are presented at the division level and exclude amortization expenses on non-financial assets. In November 2015, the data will be presented at the group level.

The CCOFOG classification is assigned based on the primary mandate of a government program, together with additional information provided by the Canadian Government Finance Statistics (CGFS) coding. When a program has multiple mandates requiring multiple CCOFOG codes, available financial documents are used to determine which mandate accounts for the largest proportion of the observed expense. The total value of the program is then assigned to that CCOFOG code.

Special funds usually have a single function, and thus a single CCOFOG code is assigned. For example, a social housing authority would have all of its expenses coded to 71069 – Housing.

The assignment is always at the lowest level of CCOFOG detail, which is the class level.
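The dominant-mandate rule described above can be sketched as follows. This is an illustrative sketch only, not Statistics Canada's production coding system; the function name, the program value, and the mandate shares (including the hypothetical code 70629) are assumptions for illustration.

```python
# Illustrative sketch of the CCOFOG assignment rule: when a program has
# multiple mandates, the entire program value is assigned to the single
# class-level code with the largest observed expense share.

def assign_ccofog(program_value, mandate_shares):
    """Assign the full program value to the dominant class-level code.

    mandate_shares maps a five-digit CCOFOG class code to the share of
    the observed expense attributable to that mandate.
    """
    dominant_code = max(mandate_shares, key=mandate_shares.get)
    return {dominant_code: program_value}

# A hypothetical program with two mandates: 60% housing (71069) and
# 40% a second, illustrative class (70629). The whole value is coded
# to 71069, the dominant mandate.
allocation = assign_ccofog(10_000_000, {"71069": 0.6, "70629": 0.4})
print(allocation)
```

Note that the rule deliberately avoids pro-rating: the class with the largest share absorbs the full program value, which mirrors the "total value of the program is then assigned to that CCOFOG code" convention in the text.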

General assignment principles

The 2014 Government Finance Statistics Manual, published by the International Monetary Fund, provides an overview of the COFOG assignment rules in Chapter 6 and its annex. Canada rigorously adheres to the guidelines described in the manual but has introduced certain nuances that more accurately reflect the Canadian reality. The “Detailed assignment decisions” section explains these nuances by class and/or function.

When a program significantly affects a number of different classes in the same group, or when there was not enough detail, an aggregate was sometimes created. For example, aggregate 70459 – Transport n.e.c. was created to represent the sum of transport expenses that could not be specifically allocated to the road transport, water transport, railway transport, air transport, and pipelines and other transport systems classes.
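The "n.e.c." fallback can be sketched in the same style. This is a minimal sketch under stated assumptions: only the aggregate code 70459 appears in the text; the five specific transport class codes in the set below are illustrative placeholders, not codes confirmed by the source.

```python
# Sketch of the aggregation fallback: a transport expense keeps its
# specific class code when source detail identifies one; otherwise it
# is rolled up into the created aggregate 70459 - Transport n.e.c.
# The specific codes below are illustrative placeholders.

TRANSPORT_CLASSES = {"70451", "70452", "70453", "70454", "70455"}

def transport_code(class_code):
    """Return the class code if it is a recognized transport class;
    otherwise fall back to the 70459 aggregate."""
    return class_code if class_code in TRANSPORT_CLASSES else "70459"

print(transport_code("70451"))  # specific class is kept
print(transport_code("704xx"))  # insufficient detail falls back to 70459
```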

Detailed assignment decisions

Division 701 – General public services

Centralized services such as Access Ontario are classified under 70133 – Other general services. Services shared by certain departments, such as information technology and human resources, are deemed to be “centralized” if they cover more than two departments.

Government research institutes are generally classified under Basic research (70149); most other research institutes are assigned to applied research or experimental development in their area of expertise (health, agriculture, etc.).

All negotiations of territorial treaties with Aboriginal bands are included in class 70169 – General public services.

All expenses identified under the CGFS classification as interest expense are classified under 70179 – Public debt transactions.

Transfers to governments for infrastructure expenses are coded under group 7018 – Transfers of a general character between different levels of government. Code 70181 was created to identify transfers to the federal government, while code 70182 identifies transfers to provincial governments and code 70183, transfers to local governments.

Division 702 – Defence

Military defence is exclusively a federal government jurisdiction – these expenses will not be found at the provincial/territorial or local level.

Division 703 – Public order and safety

In Canada, probation and parole monitoring programs are the responsibility of prison administrations and not the courts as recommended by the Government Finance Statistics Manual. To preserve the comparability of international data, we have left these programs under the courts, but we have set them apart by identifying them by a specific code, 70331. This code will make it easier to transfer the program when Canada publishes its public order and safety expenses under its Justice framework.

Similarly, two key Public order and safety programs in Canada also received their own unique codes: 70332 for legal aid and 70333 for administrative tribunals.

Division 704 – Economic affairs

Expenses related to status of women boards and other gender equality initiatives are included in 70412 – General labour affairs because, historically, the employment component was the initial focus of these programs.

A CCOFOG group was created to integrate the expenses of programs involving immigration and citizenship, namely, 70413 – Citizenship and immigration.

As mentioned earlier, a special aggregation was created to combine transport expenses when there is not enough detail to identify a specific class: 70459 – Transport n.e.c.

Division 705 – Environmental protection

At the local government level, it is sometimes difficult to separate water supply (70639) and waste water management (70529) expenses; in these instances, a new CCOFOG classification was created to aggregate the two types of expenses (70631).

Division 706 – Housing and community amenities

At the local government level, it is sometimes difficult to separate water supply (70639) and waste water management (70529) expenses; in these instances, a new CCOFOG classification was created to aggregate the two types of expenses (70631).

Division 707 – Health

Division 708 – Recreation, culture and religion

Division 709 – Education

The level of available detail in our source data on education expenses does not allow us to estimate pre-elementary and elementary data or the first and second cycles at the secondary level. We have therefore grouped these classes together in a new aggregated category, 70929 – Elementary and secondary education.

We are also unable at this time to separate non-doctoral higher education (70941) from doctoral (70942); we have therefore combined these two classes into a new aggregated category University education (70949).

Furthermore, when there was not sufficient detail to distinguish college education (70939) from university education (70949), the default choice was to classify this expense under university education (70949).

Internships and apprenticeships were included in division 709 – Education only when such hours were essential to obtain credits toward the degree or diploma. Otherwise, internships and apprenticeships are included in division 704 – Economic affairs under group 70412 – General labour affairs.

Division 710 – Social protection

To accommodate the requirements of public order and safety expenses, a new class, 71071 – Victims of crime, was created under group 7107 – Social exclusion n.e.c.

Figure 1 Public Sector Universe

Description for Figure 1

The hierarchy of the public sector along with its subcomponents.

Public sector:

  • General governments
    • Federal general government
      • Government
        • Ministries and departments, non-autonomous funds and organizations
        • Autonomous funds and organizations
      • Federal non-autonomous pension plans
    • Social Security Funds Footnote 1
      • Canada Pension Plan
      • Quebec Pension Plan
    • Provincial and territorial general government
      • Government
        • Ministries and departments, non-autonomous funds and organizations
        • Autonomous funds and organizations
      • Provincial non-autonomous pension plans
      • Universities and colleges
      • Health and social service institutions
    • Local general government
      • Government
        • Municipalities and quasi-municipalities, non-autonomous funds and organizations
        • Autonomous funds and organizations
      • School boards Footnote 2, Footnote 3
    • Aboriginal general government
      • Government
        • Aboriginal governments
  • Government business enterprises
    • Federal government business enterprises
    • Provincial and territorial government business enterprises
    • Local government business enterprises

Footnotes

The Canadian Health Measures Survey 2007/2008 to 2012/2013

Evaluation Report

February 2013

  • Executive summary
    • Context
    • Evaluation design and methodology
    • Key findings
  • 1. Introduction
  • 2. Canadian Health Measures Survey
    • 2.1 Background
    • 2.2 Canadian Health Measures Survey - objectives, activities and expected outcomes
    • 2.3 Governance
    • 2.4 Program resources
    • 2.5 Evaluation context, scope and issues
  • 3. Methodology
    • 3.1 Approach and design
    • 3.2 Document and literature review
    • 3.3 Administrative data review
    • 3.4 Financial data
    • 3.5 Key informant interviews and group interviews
    • 3.6 Survey of primary data users
    • 3.7 Limits of the evaluation
  • 4. Key findings
    • 4.1 Relevance
      • 4.1.1 Program responsiveness
      • 4.1.2 Alignment with the government's and Statistics Canada's priorities and obligations
    • 4.2 Performance
      • 4.2.1 Effectiveness – Achievement of outputs and expected outcomes
      • 4.2.2 Performance - Efficiency and economy
  • 5. Conclusions
    • 5.1 Relevance
    • 5.2 Performance – Effectiveness
    • 5.3 Performance – Economy and efficiency
  • 6. Recommendations
  • 7. Management response and action plan
  • Appendices
    • Appendix 1: Canadian Health Measures Survey process map
    • Appendix 2: A composite logic model for all activities under the Action Plan
    • Appendix 3: Canadian Health Measures Survey (CHMS) evaluation matrix
    • Appendix 4: A list of publications reviewed as part of the literature review

Executive summary

This report presents the results of the Canadian Health Measures Survey (CHMS) evaluation, which was conducted from April to October 2012. It provides credible and neutral information on the ongoing relevance and performance of Statistics Canada's CHMS activities. The scope of the evaluation is the time period from 2007/2008 to 2012/2013. This evaluation examined the five Treasury Board Secretariat core issues, in accordance with the Policy on Evaluation.

Context

CHMS is Canada's first nationally representative direct health measures survey. The survey was launched in 2007 to address long-standing limitations and data gaps within Canada's health information system. The principal objective of CHMS is to collect new and important data on Canadians' health status by

  • providing a platform and infrastructure for obtaining data and information through physical and laboratory measures to meet the emerging needs of several branches within Health Canada (HC) and the Public Health Agency of Canada (PHAC), as well as other add-on studies
  • collecting and disseminating direct health measures data, including those on environmental contaminants
  • promoting research using direct health measures data by providing access to nationally representative data.

Evaluation design and methodology

The evaluation was conducted in accordance with the Treasury Board Secretariat (TBS) Standards on Evaluation for the Government of Canada. Evaluation design, effort and resources were calibrated while taking into consideration the roll-up horizontal evaluation that would use the findings of this study as an input. To assess CHMS, evaluation issues and questions from the current Horizontal Evaluation Matrix for the Action Plan were slightly updated to better reflect the evaluation questions and indicators applicable to Statistics Canada.

The approach used to evaluate CHMS can be defined as theory-driven and based on a non-experimental design using post-collected information. Findings are collected from multiple lines of evidence: document and literature reviews, administrative and financial data reviews, a series of key informant interviews, and a survey of primary data users.

The evaluation efforts ensure an appropriate balance between the level of effort and the context. Nevertheless, because of contextual elements and resource constraints, there are a number of limitations associated with this evaluation:

  • Timing of the evaluation – The findings are limited by how complete the CHMS cycles are (only Cycle 1 was fully complete at the time of the evaluation).
  • Measurement of outcomes – There was a limited availability of performance data systematically collected on an ongoing basis according to the performance measurement framework for the Action Plan developed in 2009.
  • Interviews – A limited number of key informant interviews were conducted; however, this limitation can be mitigated during the roll-up evaluation if the leading team at HC finds it necessary.
  • Document and data review – Some challenges were faced when assessing the efficiency and economy of the CHMS program. The practice of compiling financial data by fiscal year and not by cycle, the lack of baseline data on cost and resources used by level of output produced, and the lack of data on similar surveys were some of the factors that affected the completeness and the conclusiveness of the evaluation evidence.

Key findings

Relevance

CHMS represents a unique program in Canada, providing direct health measures data to support health research, policy, and decision making. Evidence shows that the program is relevant to Canadians and health organizations, with a clear present and future need for the program. However, although most feel that CHMS responds to current and emerging content needs, no content plan was ever developed for CHMS.

CHMS is well aligned with the priorities of the federal government and Statistics Canada. The federal government is in the best position to deliver CHMS, and has specific legislative obligations being met through the program, making the delivery of the program a legitimate, appropriate and necessary activity.

Performance - Effectiveness

CHMS outputs have been produced as expected, and the expected immediate outcome is being achieved: reliable and usable data are available on the baseline health status of Canadians and on the level of, and exposure to, environmental contaminants. Some issues exist concerning the accessibility of the data, which could impair the long-term outcomes if not addressed. More specifically, even though most researchers are aware of the data, many are unaware of how to access them. This view is corroborated by external interviewees, who believe that the lack, or complexity, of accessibility is a barrier to the use of the data. Some key informants commented on the unavailability of CHMS microdata files on the Internet, and on how researchers must rely on the research data centres (RDCs) to use CHMS data files. Internal and external researchers raised various concerns related to RDCs, such as a lack of awareness of how to access the data and the time needed to get proposals approved (sometimes up to six months).

Evaluation findings demonstrated evidence of awareness of CHMS data among the general public, researchers and data users as a result of publications based on Cycle 1 data. However, issues such as the 'reach' of publications, the lack of promotion of CHMS data to scientific communities, and the timing of this evaluation were identified as limitations to awareness. Researchers felt that there needs to be better communication and collaboration to avoid duplicating research and analysis should different research groups look at the same data.

The evaluation findings provided evidence that CHMS data are being used for research, policy, and decision making, which is the expected long-term outcome. Researchers have already used CHMS data to validate self-reported data from other surveys, as well as to conduct research for scientific discovery. In addition, CHMS data are used for policy and decision making in the areas of physical activity, environmental exposure, nutrition markers, and oral health. However, it is still too early in the program's life cycle to broadly demonstrate CHMS's full potential. Information on publications (research studies) is tracked only internally and not shared with external stakeholders, making collaboration and reporting, in some instances, more difficult.

Performance – Economy and efficiency

The shift from predominantly cost recovery to core funding of CHMS, as a result of the 2008 Action Plan, helped stabilize the survey, which had a positive impact on CHMS. In particular, it allowed for better long-term planning and for seeking opportunities to increase operational efficiency. The evaluation results demonstrated that human and financial resources were sufficient to support the program. Some external parties indicated that increased efficiency and economy could be achieved through more flexible hiring practices and through the use of local clinics and infrastructure as an alternative to the mobile clinics. While these options might be worth further consideration, their feasibility must be assessed against the extent to which they could impair the quality of data collection.

Furthermore, the use of existing CHMS infrastructure by other federal partners allows them to achieve their objectives in an efficient way while reducing time and costs. Sharing complementary knowledge between Statistics Canada, federal partners and academia increases the analytical capacity in the health domain.

CHMS is the only nationally representative direct measures survey in Canada and complements other Canadian and international studies. Therefore, there is no evidence of duplication of efforts that might influence efficiency.

1. Introduction

The purpose of this study is to evaluate the Canadian Health Measures Survey (CHMS) as part of the Action Plan to Protect Human Health from Environmental Contaminants. The Action Plan is a horizontal initiative between Health Canada (HC), the Public Health Agency of Canada (PHAC) and Statistics Canada that was launched in 2008/2009. HC is the lead department for the horizontal evaluation, which is due for completion by March 2013.

This evaluation was conducted as prescribed by the Departmental Risk-based Audit and Evaluation Plan for 2012/2013 to 2016/2017, approved by the Departmental Evaluation Committee on March 27, 2012. The project was managed and carried out by the Evaluation and Professional Practices Division, Audit and Evaluation Branch.

2. Canadian Health Measures Survey

2.1 Background

CHMS is Canada's first nationally representative direct health measures survey. Planning and development for CHMS started in 2003 as part of an extension of the Health Information Roadmap Initiative and in response to the need for a national, comprehensive source of accurate direct health data, which was expressed by policy makers, provincial health departments, researchers and health professionals from many fields. The survey was launched in 2007 to address long-standing limitations and data gaps within Canada's health information system.

To assist Canadians in reducing their health risks resulting from environmental contaminants and to further develop environmental health monitoring and surveillance, the federal government introduced the Action Plan to Protect Human Health from Environmental Contaminants (the 'Action Plan') in 2008. The Action Plan's long-term objective is to reduce health risks to Canadians, particularly vulnerable populations, from harmful environmental contaminants. To accomplish this objective, the Action Plan has two basic components: the first is the "Environmental Health Guide" and the second is monitoring and surveillance. Statistics Canada's CHMS falls under the monitoring and surveillance component. This component consists of a series of surveys and surveillance activities, with the objective of providing Canadians with a better understanding of environmental exposure and potential related health risks.

2.2 Canadian Health Measures Survey - objectives, activities and expected outcomes

CHMS collects key health information on Canadians through direct health measurements such as blood pressure, height, weight, and blood and urine samples. In addition, CHMS uses household interviews to gather information for other variables, including nutrition, smoking habits, alcohol use, medical history, current health status, sexual behaviour, lifestyle and physical activity, environmental and housing characteristics, as well as demographic and socioeconomic variables.

Objectives

CHMS has an important role in supporting the broader health agenda of the federal government.

The principal objective of the CHMS is to collect new and important data on Canadians' health status by

  • providing a platform and infrastructure for obtaining data and information through physical and laboratory measures to meet the emerging needs of several branches within HC and PHAC, as well as other add-on studies
  • collecting and disseminating direct health measures data, including those on environmental contaminants
  • promoting research using direct health measures data by providing access to nationally representative data.

Activities

CHMS is designed and implemented in consecutive cycles, each of which consists of three activities: planning, collection and dissemination. Table 1 shows the CHMS life cycle. The CHMS process map is presented in Appendix 1.

Table 1 Canadian Health Measures Survey main activities and related tasks, cycles 1 to 3

Planning
  Survey development (Cycle 1: 2003/2004 to 2007/2008; Cycle 2: Jan. 2008 to Dec. 2009; Cycle 3: Fall 2009 to Dec. 2011)
    1. Identify survey content
    2. Develop survey content
    3. Develop software systems and tools
    4. Develop protocols
  Training
    5. Train interviewers, health measuring specialists and clinic laboratory technicians

Data collection
  1. Conduct interviews at home
  2. Conduct visits to the mobile clinic
    (Cycle 1: March 2007 to Feb. 2009; Cycle 2: Sept. 2009 to Dec. 2011; Cycle 3: Jan. 2012 to Dec. 2013)
  3. Complete lab tests and return results to Statistics Canada
    (Cycle 1: March 2007 to June 2009; Cycle 2: Sept. 2009 to April 2012; Cycle 3: Jan. 2012 to April 2014)
  4. Store biospecimens in the Biobank
    (Cycle 1: March 2007 to Feb. 2009; Cycle 2: Sept. 2009 to Dec. 2011; Cycle 3: Jan. 2012 to Dec. 2013)

Dissemination
  1. Perform data processing and post-collection
    (Cycle 1: March 2007 to June 2009; Cycle 2: Sept. 2009 to April 2012; Cycle 3: Jan. 2012 to April 2014)
  2. Analyze and disseminate results
    (Cycle 1: Jan. 2010 to April 2011; Cycle 2: Sept. 2012 to April 2013; Cycle 3: to come)

Infrastructure (enabling function)
  1. Methodology
  2. System development
  3. Other
    (Ongoing for all cycles)

Expected outcomes

The activities and outputs of CHMS are expected to lead to the following outcomes, as specified in the Roll-up Evaluation Framework for the Action Plan to Protect Human Health from Environmental Contaminants.

Immediate outcome (2 to 5 years):

Reliable and usable data for decision makers, researchers and Canadians on the baseline health status of Canadians, and the level of, and exposure to environmental contaminants.

Intermediate outcome (2 to 5 years):

Increased awareness among data/information users of the collected data/information.

Long-term outcome (5 to 10 years):

Decision makers increasingly use the information on the associations between contaminants and illness to guide decision making in public health practice, research, policy, regulation, and programs and services.

Leading to the ultimate outcome (10 to 20 years):

To reduce the health risks to Canadians, particularly vulnerable populations, from harmful environmental contaminants.

2.3 Governance

Statistics Canada's Policy Committee, headed by the Chief Statistician and supported by the agency's assistant chief statisticians, is responsible for the administration of CHMS under the Statistics Act. In addition to the agency-level approvals and processes, the Canadian Population Health Statistics Program (CPHSP) Committee is responsible for ensuring that the Population Health Program at Statistics Canada responds to the health needs of all Canadians. CPHSP comprises assistant deputy ministers from HC, PHAC, and Statistics Canada.

HC and PHAC's joint Research Ethics Board (REB) Footnote 1 helps ensure that CHMS meets the highest ethical standards, and that the greatest protection is provided to participants who serve as research subjects. It is guided by the principles of the Tri-Council Policy Statement, Ethical Conduct for Research Involving Humans, Footnote 2 which sets the standard for research ethics boards in Canada. The REB reviews and approves the procedures and general conduct of the survey prior to the start of each CHMS cycle.

A privacy impact assessment is presented to the federal Privacy Commissioner for every cycle of the survey in compliance with the Privacy Act. Further, provincial privacy commissioners are advised of the survey collection activities within their jurisdiction and steps are taken to protect the privacy of survey participants according to applicable provincial law.

Since the inception of CHMS, four advisory committees have provided expertise and advice on various aspects of the survey: the Expert Advisory Committee, the Physicians Advisory Committee, the Laboratory Advisory Committee, and the Quality Assurance and Quality Control Advisory Committee.

In June 2012, the Biobank Committee was created. The Biobank Committee is chaired by a director general and consists of four members internal to the federal government and four external members (e.g., academics specializing in genetics or environmental contaminants). It is responsible for granting access to Biobank specimens, which are part of the CHMS data. CHMS management has recently taken the initiative to re-align the structure of the advisory committees with respect to the current, more mature stage of the survey.

2.4 Program resources

The initial funding for CHMS was allocated in the 2003 federal budget as part of an extension of the Health Information Roadmap Initiative. In 2007/2008, CHMS received $975,860 in funding through an agreement between Statistics Canada and HC. This funding supported the completion of the CHMS planning activity for Cycle 1, prior to the launch of the Action Plan in 2008/2009.

Under the Action Plan, the total investment ($84.6 million over a five-year period) is divided between HC, PHAC and Statistics Canada, as presented in Chart 1.

Chart 1 Federal partners in the Action Plan

Bar graph displaying resources allocated over the years
Description: Chart 1 Federal partners in the Action Plan
  • The title of the graph is "Chart 1 Federal partners in the Action Plan."
  • This is a column clustered chart.
  • There are in total 6 categories on the horizontal axis: the five fiscal years from 2008/2009 to 2012/2013, plus a five-year total. The vertical axis starts at 0 and ends at 60,000, with ticks every 10,000 points.
  • There are 3 series in this graph.
  • The vertical axis is "Resource allocation ($ thousands)" and the horizontal axis is "Fiscal year."
  • The title of series 1 is "Statistics Canada." Its smallest annual allocation is 1,500, in 2008/2009, and its five-year total is 54,500.
  • The title of series 2 is "Health Canada." Its smallest annual allocation is 1,500, in 2012/2013, and its five-year total is 18,700.
  • The title of series 3 is "Public Health Agency of Canada." Its smallest annual allocation is 600, in 2008/2009, and its five-year total is 11,400.
Data for Chart 1: Federal partners in the Action Plan, Resource allocation ($ thousands)
Fiscal year   Statistics Canada   Health Canada   Public Health Agency of Canada
2008/2009     1,500               3,800           600
2009/2010     11,000              6,200           1,400
2010/2011     14,000              3,700           2,600
2011/2012     14,000              3,500           3,400
2012/2013     14,000              1,500           3,400
Total         54,500              18,700          11,400
Source(s):
Treasury Board submission.

The funding attributed to Statistics Canada totals $54.5 million over the five-year period, which represents 64% of the total allocation of funds under the Action Plan. Based on this funding, CHMS was expanded to include additional measures on environmental exposure, as well as direct health measures for children under the age of six.

The CHMS program received total core funding of $47.7 million for the period from 2008/2009 to 2012/2013. Including the 2007/2008 agreement, a total of $48.7 million was allocated to CHMS from 2007/2008 to 2012/2013, inclusively. Footnote 3

CHMS also received external funds from HC, PHAC and other entities to support special requests for additional laboratory tests and analysis that were not planned in the core funding. From 2007/2008 to 2012/2013, a total of $37.6 million was allocated as external cost-recoveries.

Therefore, the total CHMS resource allocation under the scope of this evaluation (i.e., core funding plus external cost-recoveries) represents $86.3 million over the period from 2007/2008 to 2012/2013 and is detailed in Chart 2.

Chart 2 Total Canadian Health Measures Survey resource allocations (core and external cost recovery), from 2007/2008 to 2012/2013

Chart 2 is a clustered column chart showing core funding and external cost-recovery funding, in thousands of dollars, by fiscal year from 2007/2008 to 2012/2013.
Data for Chart 2: Total Canadian Health Measures Survey resource allocations (core and external cost recovery)
($ thousands)
Year Core funding External cost-recovery funding
2007 975.86 9,455.31
2008 1,251.177 11,213.173
2009 9,841.346 4,163.594
2010 12,234.559 4,681.723
2011 12,201.489 3,720.539
2012 12,165.705 4,403.89
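As an illustrative cross-check (not part of the original report), the Chart 2 columns sum to the figures quoted in the surrounding text: core funding of about $48.7 million, external cost recovery of about $37.6 million, and a combined total of about $86.3 million:

```python
# Illustrative cross-check of Chart 2 against the figures quoted in the
# text ($ thousands, transcribed from the data table above).
core = [975.860, 1_251.177, 9_841.346, 12_234.559, 12_201.489, 12_165.705]
external = [9_455.310, 11_213.173, 4_163.594, 4_681.723, 3_720.539, 4_403.890]

core_m = round(sum(core) / 1_000, 1)          # in $ millions
external_m = round(sum(external) / 1_000, 1)  # in $ millions
total_m = round((sum(core) + sum(external)) / 1_000, 1)
print(core_m, external_m, total_m)  # 48.7 37.6 86.3
```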

2.5 Evaluation context, scope and issues

This report presents the results of the CHMS evaluation, which was conducted from April to October 2012. It provides credible and neutral information on the ongoing relevance and performance of Statistics Canada's CHMS activities outlined in the Action Plan, as well as the performance measurement information needed to support the horizontal evaluation. It also informs Statistics Canada's senior management on the overall performance of CHMS. The scope of the evaluation is 2007/2008 to 2012/2013; it was extended one year beyond the timeframe of the horizontal evaluation (2008/2009 to 2012/2013) to capture CHMS's Cycle 1, launched in 2007, the only cycle fully completed at the time of the evaluation. The evaluation assessed the relevance and performance of CHMS by examining the five Treasury Board Secretariat core issues, in accordance with the Policy on Evaluation.

Table 2 Evaluation issues and questions

Relevance

  1. Is there a continued need for the CHMS initiative and is it responsive to the needs of Canadians? (Treasury Board Secretariat (TBS) issue 1)
  2. Is the CHMS program consistent with government priorities and Statistics Canada's strategic outcome? (TBS issue 2)
  3. Is there a legitimate, appropriate and necessary role for the federal government and for Statistics Canada in implementing the CHMS program? (TBS issue 3)

Achievement of expected outputs

  1. Were CHMS's design and implementation processes appropriate for producing the expected activities and outputs?

Performance – Effectiveness (TBS issue 4)

  1. To what extent has CHMS achieved its expected outcomes?
  2. Were there any factors (internal or external) that contributed to, or detracted from, the achievement of those expected outcomes?
  3. Have there been any unintended impacts (positive or negative) from CHMS?

Performance – Efficiency and economy (TBS issue 5)

  1. Are sufficient human and financial resources in place to support the implementation of CHMS?
  2. What are the efficiencies that are being gained as a result of the Action Plan?
  3. Is there any duplication of, or complementarity with, other programs, and are there possible alternatives?

3. Methodology

The evaluation was conducted in accordance with the TBS Standards on Evaluation for the Government of Canada. To assess CHMS, evaluation issues and questions from the current Horizontal Evaluation Matrix for the Action Plan were slightly updated to better reflect the evaluation questions and indicators applicable to Statistics Canada. The CHMS Evaluation Matrix is presented in Appendix 3.

3.1 Approach and design

The approach used to evaluate CHMS can be described as theory-driven, based on a non-experimental design that relies on information collected after program implementation for most of the evaluation questions. Findings were also drawn from multiple lines of evidence: more than one method was used to measure each of the evaluation indicators, thereby strengthening the validity of the findings. To that end, five methods (combining qualitative and quantitative approaches) were selected to gather evaluation data:

  • a review of relevant documents and literature
  • a review of administrative data
  • a review of financial data
  • a series of key informant interviews
  • a survey of primary data users.

Each line of evidence used for the evaluation is described in the following sub-sections. Calibration Footnote 4 played an important role in this evaluation, which will serve as an input into the horizontal evaluation of the Action Plan being conducted by HC.

3.2 Document and literature review

A review of program documentation provided a thorough understanding of CHMS and contributed to the design of the methodology for this evaluation, including the refinement of the data-collection instruments. The information gathered from document and literature reviews provided useful context for interpreting, confirming and supplementing information gathered through the other methods.

Because of time and budget constraints, the number of external documents reviewed was limited. The documents reviewed were drawn, for the most part, from publications in Health Reports, Statistics Canada's website and official documents, as well as from publications of other federal government departments.

A review of some published literature was undertaken to gather data to respond to evaluation questions, such as program relevance and lessons learned by other agencies delivering similar programs. A list of publications reviewed is provided in Appendix 4.

3.3 Administrative data review

The administrative data review was based on internal files, documents and records provided by the program area, as well as the minutes from the evaluation team meetings held from April to June 2012.

Administrative data on performance measures (e.g., the number of website hits, research data centre [RDC] requests for CHMS data, and client service requests for custom tabulation) were used to evaluate the availability and awareness of CHMS data among decision makers, researchers and Canadians.

3.4 Financial data

Budget and full-time equivalent (FTE) data were taken directly from the Financial Management System (FMS) to assess resource use in relation to the production of outputs and progress toward expected outcomes. Actual expenditures for CHMS core and cost-recoveries were also taken from FMS.

Capital asset and depreciation information was taken into consideration for financial analysis. This information was collected from a list of capital assets and depreciation provided by Financial Planning Division and was updated by the program area as of July 2012.

3.5 Key informant interviews and group interviews

Key informant interviews provided data on the perceptions and opinions of individuals (e.g., program managers and decision makers, as well as primary data users and researchers, both internal and external to the Government of Canada). The individuals selected had played a significant role in, or had extensive experience with, CHMS, as specified by the Roll-Up Evaluation Framework for the Action Plan to Protect Human Health from Environmental Contaminants.

Interview guides were designed to address all of the evaluation questions, and adapted to each audience as required. Names of potential key informants for each group were identified through internal consultation. The method used to select key informants varied by group type: either a census, or a sample based on the informant's years of experience and exposure to the data, number of articles published using CHMS data, or representation of a subject-matter area (e.g., environmental contaminants, tobacco and infectious diseases). A total of 24 key informants were interviewed during June and July 2012. Table 3 shows the planned versus actual interviews by informant group type.

Table 3 Planned versus actual interviews and group interviews
Program managers
  • Strategic program managers (census): planned one group interview (two participants); held one group interview (two participants)
  • Analytical program managers (census): planned one group interview (three participants); held one group interview (three participants)
  • Operational program managers (census): planned one group interview (two participants); held two individual interviews instead
  • Chiefs (census): planned one group interview, if time permitted (four participants); no group interview held
Decision makers (partners / cost-recovery clients)
  • Cost-recovery clients and researchers (sample): planned three interviews; held two interviews plus one group interview with government researchers (four participants)
Primary data users and researchers
  • Internal Statistics Canada researchers (census): planned one group interview (six participants); held two group interviews (six participants in total)
  • External researchers and Expert Advisory Committee members (sample): planned five interviews; held five interviews

3.6 Survey of primary data users

A list of primary CHMS data users was compiled based on the list of researchers who had access to the shared CHMS data files at HC and PHAC, and the list of researchers who had requested access to the data through RDCs. Footnote 5 RDC requests include researchers in other federal government departments and researchers external to the federal government. The 134 primary data users who were not selected for an interview received a survey questionnaire by email, to be completed in June or July 2012. Of them, 24% were researchers from HC, 41% from PHAC, and 35% were RDC researchers.

A draft survey questionnaire was developed to reflect the methodology embedded in the evaluation matrix. It was tested with two internal researchers via one-on-one interviews. As a result of the testing, the suggested corrections were implemented and the updated questionnaire was tested again with one external researcher via telephone interview, before being finalized.

The survey was in the field for two weeks. One reminder email was sent to non-respondents after the first week. In all, 30 people responded to the survey, which represents a 22% response rate. Among responses received, 26.7% were returned from HC (this represents 25% of their population), 36.7% were returned from PHAC (20% of their population), and 36.7% were returned from RDC researchers (23.4% of their population).
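The reported response rates can be reconstructed from the figures above; this illustrative sketch (not part of the original report) shows the arithmetic, with small discrepancies from the published percentages due to rounding:

```python
# Illustrative reconstruction of the survey response-rate arithmetic.
population = 134   # primary data users who received the questionnaire
respondents = 30   # completed surveys

print(f"overall response rate: {respondents / population:.0%}")  # 22%

pop_share = {"HC": 0.24, "PHAC": 0.41, "RDC": 0.35}      # share of the 134
resp_share = {"HC": 0.267, "PHAC": 0.367, "RDC": 0.367}  # share of the 30

for group in pop_share:
    # group response rate = group respondents / group population
    rate = (resp_share[group] * respondents) / (pop_share[group] * population)
    print(f"{group} group response rate: {rate:.1%}")
```

Running this yields roughly 25% for HC, 20% for PHAC and 23.5% for RDC researchers, matching the rates stated in the text to within rounding.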

3.7 Limits of the evaluation

To the extent possible within the available budget and timeframe, the evaluation methodology incorporated multiple methods of data collection from different primary and secondary sources to ensure that the findings were valid and that they captured key points of view on CHMS. Whenever possible, the opinions and observations expressed by stakeholders were corroborated with evidence from program documentation and data.

However, because of contextual elements and resource constraints, there are a number of limitations associated with this evaluation. As specified earlier in the report, calibration played a crucial role (given that this is a part of the horizontal evaluation) and scope was defined accordingly to ensure an appropriate balance between level of effort and context. The following limitations should be taken into account when reviewing the findings from this report:

Timing of the evaluation – The timing of the evaluation limits the capacity to report on the program's outcomes because only Cycle 1 of CHMS had been completed: Cycle 2 releases were to become available by the end of September 2012, and Cycle 3 was still collecting data. Therefore, the findings are limited by the degree of completeness of these cycles.

Measurement of outcomes – The evaluation budget, together with the limited data available on expected outcomes, constrained the ability to report on the success of CHMS. Although a very specific performance measurement framework was outlined in the Action Plan for Statistics Canada, there is limited evidence that performance data were systematically collected on an ongoing basis according to this framework.

Interviews – A limited number of key informant interviews were conducted. Moreover, interviews with key informants and stakeholders did not include any independent respondents with no stake in the program. Therefore, there is a possibility that interview respondents had a positive bias toward the program. HC will have the opportunity to interview additional key informants during the roll-up evaluation.

Document and data review – Some challenges were faced when assessing the efficiency and economy of the CHMS program. Factors that affected the completeness and conclusiveness of the evaluation evidence included the practice of compiling financial data by fiscal year rather than by cycle, the lack of baseline data on costs and resources used per level of output produced, and the lack of data on similar surveys.

4. Key findings

4.1 Relevance

4.1.1 Program responsiveness

Summary of Findings:

Evidence from program documentation, the literature and interviews points to the ongoing relevance of CHMS as well as to the continued need for the program. Moreover, CHMS is seen as being flexible enough to address emerging needs.

There is evidence that national baselines for environmental exposure have been established as a result of CHMS data, a first step toward identifying trends and, in the future, links between environmental contaminants and health. Since long-term tracking is required to establish trends and to address potential implications, interventions and controls, clear evidence related to outcomes needs to be collected over time.

4.1.1.1 Continued need

According to documents reviewed, CHMS stems from a need to fill gaps in health information. It was developed to fulfill specific important needs, such as the need for

  • national baseline data on the extent of major health concerns (e.g., obesity, hypertension, vitamin and nutrition deficiencies, chronic and infectious diseases, oral health status of Canadians and the level of access to dental care)
  • national data on exposure and prevalence levels of environmental chemicals in the population
  • children's health information.

The principal objective of CHMS is to collect data on Canadians' health status by direct measures. While Statistics Canada has been collecting health status and related data for many years by other means (i.e., self-reported surveys and administrative data), these data were seen as generally limited in two important ways:

  1. Many kinds of data, such as blood pressure and physical fitness, simply cannot be ascertained in an interview; they require direct physical measurement.
  2. Health information derived from self-reporting or administrative records may be seriously biased for some variables. For example, one review Footnote 6 shows a consistent reporting bias in even simple measures such as height and weight. This bias has the potential to misinform data users. Footnote 7, Footnote 8

"An incredible service to researchers is the fact that CHMS data provide a nationally representative "normal" sample across a broad age range, in both sexes that can be compared to special interest groups in research studies."

External data user

According to the health scientists and researchers interviewed, CHMS data are needed to validate and adjust self-reported data to increase research accuracy, since these data are based on evidence and are nationally representative.

"CHMS is used primarily as a surveillance mechanism to know the health of the nation, and in this role it provides the best data on physical activity, fitness, obesity, blood pressure, cardiovascular risk factors, etc. As much as it is important for us to know the extent of these health issues in Canada and how they change over time to help direct policies and priorities, the same importance is attributed to CHMS (being the only source of data we have on these things in Canada)."

External data user

Indeed, according to the Statistics Canada paper "Canadian Health Measures Survey: Rationale, background and overview" in the Health Reports special issue: Background Papers on the Canadian Health Measures Survey, efforts to correct for self-reporting bias are complicated by the probability that bias for some measures may be unstable over time and susceptible to media attention and social marketing campaigns, among other influences. Direct health measurements provide more robust, objective measures and allow for the assessment of variables that simply cannot be determined accurately through self-reporting (e.g., metabolic syndrome, environmental toxin exposure, lung function). Such data are needed for public health education, health promotion programs, health care planning, health surveillance and research.

In Canada, a number of program and policy initiatives from 2001 to 2006 heightened the need for surveillance and monitoring of public health indicators, providing a direct or indirect impetus for the creation and ongoing support of a direct health measures survey. Table 4 gives examples of such initiatives.

Table 4 Examples of initiatives that led to the increased need for a direct physical measures survey
Initiative Description
Chronic Disease Prevention Alliance of Canada (2001) Advocacy for integrated research, surveillance, policies and programs, and the resources needed to positively influence the determinants of health and reduce incidence of the chronic diseases that account for the largest burden of morbidity, mortality and cost in Canada, namely, cardiovascular disease, diabetes and cancer. Footnote 9
Building on Values: The Future of Health Care in Canada (Romanow Report) (2002) Report of the Commission on the Future of Health Care in Canada, which reported on consultations with Canadians on the future of Canada's public health care system and recommended policies and measures that offer quality services to Canadians and strike an appropriate balance between investments in prevention and health maintenance and those directed at care and treatment. Footnote 10
Review of Human Biomonitoring Studies of Environmental Contaminants in Canada 1990-2005 (2006) Provided strong evidence of a need for more comprehensive and intensive biomonitoring of environmental contaminants in Canada. Footnote 11

The need for a direct health measures survey is not unique to Canada. Several countries have a history of conducting surveys that include direct health measures and that have yielded important findings, which validates the need for this type of survey. For example, the U.S. National Health and Nutrition Examination Survey (NHANES), Footnote 12 conducted since the early 1960s, Footnote 13 has provided data to construct standard growth charts for children, thereby allowing doctors and parents to better understand developmental health trajectories. Footnote 14 In the 1960s, NHANES confirmed findings linking high cholesterol and heart disease. It also provided the first evidence that Americans had high lead levels in their blood, which motivated governments to phase out the use of lead as an additive in gasoline and paint. Footnote 15 In Australia, a direct health measures survey conducted from 1999 to 2001 found that for every known case of diabetes, there was one undiagnosed case, and that nearly one million Australians over age 25 had diabetes. Footnote 16 Finland, too, has a legacy of important public health and scientific findings from national direct health measures surveys. Footnote 17, Footnote 18, Footnote 19

In addition, findings from the survey of CHMS primary data users confirmed the current need for CHMS data for many purposes, such as trend analysis, monitoring and tracking changes, providing objective physical measures and laboratory data, providing a nationally representative sample, research, health impact assessments, diagnoses, prevention, informing policy, and policy development support. All survey respondents confirmed the need to continue collecting and providing CHMS data on an ongoing basis.

4.1.1.2 Inclusion of environmental exposures in the CHMS scope

Growing concern among Canadians about environmental contaminants has been observed over the past decade. For example, a 2006 EKOS Research Associates Inc. survey showed that 89% of Canadians believe the health of their children is being adversely affected by environmental pollution, and nearly one in four Canadians has sought medical treatment for a condition they believed to be related to the environment.

Two major federal government initiatives, the 2006 Chemicals Management Plan and the 2007 Clean Air Agenda, committed to addressing issues related to environmental contaminants. CHMS data are especially relevant because the steps already taken by the government under these initiatives rely on CHMS to collect environmental contaminant measures. Both industry and non-governmental stakeholders supported these initiatives. They continue to insist that decisions be made on the basis of scientific evidence, and that Canadians have credible information on the impact of chemicals in the environment and the steps that they should take as a result.

A review of CHMS content demonstrates that the survey does address emerging environmental issues, and that children under the age of six are represented in the sample. Table 5 provides an overview of the scope of the environmental exposure variables in the first three cycles of the survey. According to a cost-recovery client, these data provide a baseline and considerable potential for identifying links between environmental contaminants and health; however, long-term tracking is required to establish such trends (some take up to 20 years to establish) and to address potential implications, interventions and controls. Consequently, such links cannot yet be identified.

Table 5 Canadian Health Measures Survey environmental exposures: Number of variables measured, per cycle, per category and source
Variables measured at households, in tap water
(fluoride; volatile organic compounds – common fuel pollutants [BTEX]; trihalomethanes)
  • Cycle 1: not in scope; Cycle 2: not in scope; Cycle 3: 11
Variables measured as part of the mobile examination clinic collection, in indoor air
(aldehydes; aliphatic alcohols; aliphatic hydrocarbons; aliphatic ketones; aromatic compounds; chlorinated hydrocarbons; chlorobenzene derivatives; ethers; siloxanes; terpenes; others)
  • Cycle 1: not in scope; Cycle 2: 102; Cycle 3: 81
Laboratory measures from human specimens
(volatile organic compounds; acrylamide; benzene metabolites; carbamate insecticides; chlorophenols; metals and trace elements; organophosphate metabolites; parabens; perfluorinated compounds; phthalate metabolites; polyaromatic hydrocarbons; pesticides; tobacco; herbicides)
  • Blood: Cycle 1: 25; Cycle 2: 12; Cycle 3: 21
  • Urine: Cycle 1: 51; Cycle 2: 89; Cycle 3: 46
  • Total: Cycle 1: 76; Cycle 2: 101; Cycle 3: 67
Of these, measures taken from children under the age of 6:
  • Blood: Cycle 1: 0; Cycle 2: 12; Cycle 3: 8
  • Urine: Cycle 1: 1; Cycle 2: 80; Cycle 3: 38
  • Total: Cycle 1: 1; Cycle 2: 92; Cycle 3: 46
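As an illustrative consistency check (not part of the original report), the blood and urine counts in Table 5 sum to the stated totals in every cycle, both overall and for children under six:

```python
# Illustrative consistency check of Table 5 counts, Cycles 1 to 3.
blood = [25, 12, 21]         # all respondents
urine = [51, 89, 46]
stated_total = [76, 101, 67]

child_blood = [0, 12, 8]     # children under age 6
child_urine = [1, 80, 38]
child_total = [1, 92, 46]

for b, u, t in zip(blood, urine, stated_total):
    assert b + u == t
for b, u, t in zip(child_blood, child_urine, child_total):
    assert b + u == t
print("Table 5 totals are internally consistent")
```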
4.1.1.3 Addressing current and emerging needs

When asked about emerging needs or information gaps that CHMS is expected to address, key informants and several documents identified a number of emerging needs related either to new content themes or to enhancing the analytical power of the data. A summary of the most common suggestions is presented below.

New or enhanced content

  • Data on non-insured components of health (e.g., oral health, hearing, eyesight)
  • More comprehensive information on dietary and nutrition intake
  • Medication and dietary supplement intake
  • Bone health
  • Lung function
  • Noise exposure, expanded range of environmental exposure (more than 23,000 chemicals that need to be assessed and regulated)
  • Genetic testing, genetic links, personalized medicine, genome
  • Sleep
  • Specific measures (e.g., oral glucose tolerance test, cortisol [stress hormone])

Analytical capacity requirements Footnote 20

  • Record linkages
  • Longitudinal data
  • Geographic information
  • Data on sub-national level (e.g., regional, provincial, specific social groups, specific age groups, vulnerable populations)

Most key informants interviewed and data users surveyed feel that CHMS responds to current needs. As a result of CHMS data, Canada has objective, nationally representative baselines on important health measures, which have the potential to be used by doctors, health professionals Footnote 21 and researchers Footnote 22 as a reference for the health characteristics of Canadians.

"It is a period [of eight years] that is not too far in the future to keep us too static, but also is not too small, to ensure some continuity and stability. It (the survey) will have core content that runs across the entire period; there will be rotating content that is going in and out on shorter time periods within these eight years, based on priority needs; and there will be a small capacity free so that we can, on a quick time line, add new content to address urgent emerging needs."

Internal interview

Further, CHMS is seen by interviewees as being able to address emerging needs, in terms of new content, because:

  • It provides an infrastructure that is comprehensive in scope, and a system of collecting direct measures that is diverse and flexible to accommodate the information needs of important health issues that may emerge in the future.
  • The content priorities are determined through the federal governance structure, following broad consultations with federal partners and other stakeholders; consequently, CHMS adjusts the survey content to respond to the needs identified by them.

Although most feel that CHMS is responsive to current and emerging content needs, interviewees and documents reviewed indicate that no content plan was ever developed for CHMS to ensure the survey remains relevant and can continue to address emerging content needs.

With respect to the Biobank, Footnote 23 several researchers mentioned its significance, importance and usefulness, highlighting that it is the first nationally representative Canadian repository of biosamples. Internal and external researchers attest to the relevance of the Biobank, noting that it offers great potential for future research or surveillance of health issues, when a need arises.

Some analytical limits of CHMS persist because of its sample size. Many key informants said they need larger sample sizes to conduct more complex research, and reported that this limitation affects their use of the data. This aspect is presented in greater detail in the section on performance (validity, reliability and access).

4.1.2 Alignment with the government's and Statistics Canada's priorities and obligations

Summary of Findings:

Evidence from program documentation, the literature, and interviews all suggest that the CHMS objectives, under the Action Plan, are consistent with federal government priorities, and are aligned with Statistics Canada's outcomes.

Findings demonstrated that CHMS supports the federal government in fulfilling its legislative obligations, international commitments and its responsibility to protect the health of Canadians. Since these priorities are ongoing, there is still a legitimate, appropriate and necessary role for the federal government to continue CHMS.

4.1.2.1 Alignment with the government's priorities and Statistics Canada's strategic outcome

Government priorities

Collecting health data has been underscored as a federal priority on several occasions. Initial funding for the CHMS survey was outlined in the 2003 Federal Budget, in which the Government of Canada highlighted the importance of "the availability of accurate and timely information on trends in health status and health system performance as a crucial tool to inform responsive, patient-centered health policy decisions." Moreover, investing in health and safety to provide Canadians with better information on the links between pollution and illness was reaffirmed in the 2008 Federal Budget.

The alignment of CHMS with federal government priorities is ensured by its governance at the strategic level. CHMS is one of several Statistics Canada health information activities under the CPHSP and, as such, its strategic priorities and decisions are determined by the CPHSP tri-partite steering committee representing HC, PHAC and Statistics Canada. By its mandate, the CPHSP committee is responsible for ensuring that federal government health information needs are addressed.

Document reviews revealed evidence that CHMS supports the federal government in fulfilling its legislative obligations, its international commitments and its responsibility to protect the health of Canadians. For example:

  • CHMS data inform and support the federal government in its ongoing monitoring, surveillance, risk management and regulating activities, which are required by multiple legislative acts such as the Health Canada Act, the Canadian Environmental Protection Act, the Pest Control Products Act, the Food and Drugs Act, and the Hazardous Products Act.
  • CHMS information helps to address the federal government's international commitments and obligations, such as the North American Free Trade Agreement, the Canada–U.S. Great Lakes Water Quality Agreement, United Nations negotiations on the Global Treaty for Mercury, and the Stockholm Convention on Persistent Organic Pollutants.
  • CHMS data inform the creation and updating of many federal government standards and guidelines related to health and environment (such as the Guidelines for Canadian Drinking Water Quality and the Federal Tobacco Control Strategy). The survey also provides essential information in support of several broader federal government initiatives, such as the Chemicals Management Plan, the Health Information Roadmap Initiative, and the Clean Air Agenda.

Statistics Canada's strategic outcome

Supporting CHMS and other projects that provide statistical information and analysis about the state of Canadians' health is set out as one of Statistics Canada's activities in the 2011-2012 Report on Plans and Priorities. The CHMS also constitutes one of the programs designed to achieve Statistics Canada's strategic outcome as defined in the agency's 2011 Program Activity Architecture:

Canadians have access to timely, relevant and quality statistical information on Canada's changing economy and society for informed debate, research and decision making on social and economic issues.

In summary, the work of Statistics Canada to conduct CHMS is consistent with the agency's priorities and supports the achievement of its strategic outcome. It also supports the achievement of the Action Plan's objectives by increasing the knowledge base on contaminant levels and potential impacts on health, and consequently contributes to the Action Plan's ultimate outcome (i.e., reducing the health risks to Canadians from harmful environmental contaminants).

4.1.2.2 The federal government's and Statistics Canada's roles in implementing the Canadian Health Measures Survey program

"Given that Statistics Canada is the mandated statistical body for the country it makes sense to do that work and make data available."

External researcher

The Constitution Act, 1867 establishes "census and statistics" as an area of federal jurisdiction. Parliament has exercised its responsibility for the census and statistics primarily through the Statistics Act. The act creates Statistics Canada as Canada's national statistical office and establishes its mandate, powers and obligations. Under the act, Statistics Canada must collect, compile, analyze and publish statistical information on the economic, social and general conditions of the country and its people. Under section 22(c) of the Statistics Act, Statistics Canada has the mandate to collect, compile, analyze, abstract and publish statistics on health and welfare.

"Statistics Canada has the infrastructure and the reputation of trust, confidentiality and neutrality to achieve the high response rates and thus produce high quality information."

Researchers

Key informants indicated that while health is a provincial jurisdiction, there is an important role for the federal government to provide credible, nationally representative survey measures for comparability, consistency, and objectivity. According to them, "there is nobody in a better position than the federal government to create and maintain a statistical program of the magnitude, visibility, and complexity of the CHMS. Internationally, similar programs are also delivered at a federal level."

According to senior management, CHMS was designed in response to a gap in data on the general health of the population based on direct health measures. If CHMS no longer existed, it would represent a step backwards in how health in Canada is understood, especially for children aged 3 to 11, as there are currently no other data available for this age group. This finding was reaffirmed by the researchers interviewed. Without CHMS, researchers would lose the ability to adjust self-reported data against objective direct measures. In health policy, a lack of CHMS data could impair decision making and program outcomes, as CHMS provides the clearest picture available of health issues and their determinants for Canadians.

4.2 Performance

4.2.1 Effectiveness – Achievement of outputs and expected outcomes

The expected outcomes measured in this evaluation are specified in Statistics Canada's Action Plan to Protect Human Health from Environmental Contaminants Roll-up Evaluation Framework. The Results-based Management Accountability Framework established in 2008 for the Action Plan presents a timeline to reach each outcome. The evaluation findings have focused on measuring expected outcomes at the immediate level because of CHMS's life cycle and the timing of this evaluation. However, there is some evidence demonstrating that CHMS is on track and progressing toward its intermediate and long-term outcomes.

Prior to assessing if outcomes have been achieved, it is important to understand the delivery of the outputs as stated in the Action Plan, as they are the catalysts through which expected outcomes are achieved.

4.2.1.1 Outputs

Evidence indicates that CHMS activities are carried out as planned, and the CHMS outputs have been produced as expected.

The status of CHMS activities, as of August 2012, was presented in Table 1. As the table shows, Cycle 1 is fully completed, Cycle 2 is in its final dissemination phase, and Cycle 3 collection is in progress.

Based on administrative data and interviews with program management, the evaluation findings demonstrate that CHMS outputs were produced as planned. Indeed, the Action Plan stated that CHMS should have 5,000 participants in cycles 1 and 2 and 2,500 participants in Cycle 3 (at the time of the evaluation), and that children under the age of 6 should be included in the sample. In terms of measures, the Action Plan required additional environmental measures, including air and water samples.

Table 6 depicts the completion status of the outputs and demonstrates that the targets outlined in the Action Plan were achieved.

In addition, as observed by reviewing the administrative data, biomonitoring data were collected from the 3-to-5 age group, predominantly in cycles 2 and 3. There were 135 biomonitoring measures Footnote 24 taken from children under the age of 6 in Cycle 2, and there are 88 such measures currently being collected from this age group in Cycle 3, compared with only 1 measure taken from this age group in Cycle 1.

Moreover, a range of environmental exposure measures have been collected, including biospecimens (all three cycles) and indoor air samples (cycles 2 and 3). Also, a new source of measures within the environmental exposure theme has been included in Cycle 3: tap water. Footnote 25

Table 6 Status on outputs, as of August 2012
Activity Outputs Cycle 1 Cycle 2 Cycle 3
Status (quantity) Status (quantity) Status (quantity)
Collection   Planned number of participants: 5,600 Footnote 1 Planned number of participants: 5,700 Planned number of participants: 5,700
Household questionnaires Completed (6,604) Completed (7,830) In progress
Physical measures data Completed (5,617) Completed (6,432)
Physical activity monitor Completed Completed
Biospecimens Blood Completed (5,385) Completed (6,163)
Urine Completed (5,552) Completed (6,363)
Biobank samples Collected Collected
Indoor air samples Not applicable Completed
Tap water samples Not applicable Not applicable
Dissemination Dissemination plan Available completed Available completed Too early
Daily releases 10 In progress
Microdata files 10
Custom tables Approximately 40
Peer-reviewed articles in Health Reports 23
Peer-reviewed articles in other journals Approximately 30
Reports 5
Fact sheets 14
Summary tables 65
4.2.1.2 Immediate outcome

Reliable and usable data for decision makers, researchers and Canadians on the baseline health status of Canadians, and on the level of, and exposure to, environmental contaminants.

Evidence indicates that CHMS data are reliable and usable, and that they are accessible through shared files with partners, as well as through Statistics Canada's RDCs.

Through the evaluation study, users expressed some concerns with the comparability and analytical power of CHMS data, and shared their challenges with the availability of the data.

Reliability and usability

Most of the key informants interviewed stated that CHMS data were reliable and robust. Three-quarters of the survey respondents indicated that they were satisfied or very satisfied with the data's reliability and validity.

Furthermore, evidence from administrative data demonstrates a high response rate for Cycle 1 (88% for the household questionnaire and 85% for the mobile clinic) and for Cycle 2 (90% for the household questionnaire and 82% for the mobile clinic), which is a valid indicator of the quality of the information produced.

Although researchers indicated that they were satisfied with the reliability and usability of CHMS data, they raised the following two concerns: 1) comparability of the data and 2) inability to conduct analysis at the sub-national level.

Concern 1 – Comparability of data

According to interviewees, there are many valid ways to measure health indicators and environmental contaminants, but if they are not measured the same way from one cycle to the next, or from one survey to the next, they are difficult to compare for analytical purposes. For example, CHMS measures cadmium (or caffeine) in urine, while NHANES measures it in blood; both are valid measures, but they are not comparable.

Researchers mentioned the following possible causes for measuring things differently: 1) laboratory measures are subject to varying laboratory methods, measurement approaches, measurement errors and changing measurement protocols; and 2) competing interests and money.

Concern 2 – Inability to conduct analysis at the sub-national level

Most researchers mentioned that because of the size of the sample, they were unable to conduct analysis at the sub-national level or more complex analysis on specific subsets of the population (e.g., vulnerable populations, people with rare conditions). This, in turn, limited the reliability and the usability of the data. However, the impact of the limitations is not consistent across content themes (e.g., no issues with physical activity were mentioned).

According to program documentation, CHMS was established to be a national survey and therefore provide national estimates. Conducting analysis at a sub-national level was not in scope for the initiative.

Availability

"The RDCs are providing such a fabulous service to researchers. But one has to really know what to do; it is not a fishing expedition. If they know what they want to do they get support."

Internal manager

Evaluation findings revealed that data are not as accessible as they could be. Some of the comments collected are related to the unavailability of microdata files for CHMS on the Internet and the fact that researchers must go to RDCs to use CHMS data files. In terms of RDCs, internal and external researchers raised various concerns, such as a lack of awareness of how to access the data; timelines to get proposals approved (sometimes up to six months); issues with location and times (days and hours) of use; restrictions on material to be brought in and out; and, in some locations, the ability and knowledge of staff to work with the CHMS data files. Footnote 26

Findings from internal interviews revealed that CHMS management identified no issues regarding the availability and accessibility of CHMS data. According to management, the two federal partners have shared files and external researchers have full access through the RDCs. Almost three-quarters of the primary data users surveyed corroborated this, answering that they were satisfied or very satisfied with their awareness of data availability. CHMS management also stated that a communication strategy would be launched in fall or winter 2013 to ensure that all information about accessing CHMS data and Biobank data is available and visible to the public. Because of the timing, however, the evaluation could not validate this statement.

4.2.1.3 Intermediate outcome

Increased awareness among those who use the collected data and information

Evaluation findings demonstrated evidence of awareness among the general public, researchers and data users as a result of publications based on Cycle 1 data. However, it is still too early to present valid measures of increased awareness as the release of Cycle 2 data and its related products is currently in progress. Furthermore, some issues such as the 'reach' of publications and the lack of promotion of CHMS data to scientific communities were identified as limitations to awareness, in addition to the timing of this evaluation.

According to CHMS administrative data, the general public is interested in Cycle 1 data, as seen in the number of published newspaper articles, TV and radio broadcasts, Internet articles, and website hits presented in Table 7.

Table 7 Media uptake, January 2010 to March 2012
Type of media Total from Cycle 1
number
Newspaper articles (printed) 171
TV and radio broadcasts 37
Internet articles 248
Twitter entries (tracked only for the first week after the release) 63
Other articles 463

As Table 8 summarizes, the general public has also shown an interest in and awareness of the CHMS products available on the Statistics Canada website.

Table 8 Website hits, April 2009 to December 2011
  Total from Cycle 1
number
Total (English and French) 184,412
For summary tables only 6,973
For metadata (dictionaries, etc.) 7,216
Data table downloads Footnote 1 3,519
Data tables page views Footnote 1 13,858

Finally, Table 9 indicates that external researchers have expressed an interest in Cycle 1 data based on the number of research proposals submitted (64), inquiries to client service areas and requests for custom tabulations.

Table 9 Requests for data, inquiries and custom tabulations
  2008/2009 2009/2010 2010/2011 2011/2012
number
Requests received for Biobank use for research studies ... ... ... 6
Research proposals using CHMS data (university and federal RDCs) ... 4 28 32
Inquiries and requests addressed by client services (from January 2010 to December 2010) ... ... 194 ...
Custom tables prepared based on requests (from January 2010 to December 2010) ... ... 115 ...

Although all three tables demonstrate that there is interest in, and awareness of CHMS data, it is difficult to quantify the level of awareness. These numbers, however, can form the baseline against which to measure increased awareness as future CHMS data cycles become available and products are released.

According to all key informants, awareness of data availability exists among researchers to varying extents. It was also noted that the data are still new (Cycle 1 data were released from January 2010 to April 2011) and that, as more publications come out, awareness should increase. However, even with these efforts, interviewees indicated that there may continue to be some issues in increasing awareness because of the 'reach' of the publications and the lack of active promotion of the data to scientific communities.

4.2.1.4 Long-term outcome

Decision makers increasingly use the information on the associations between contaminants and illness to guide decision making in public health practice, research, policy regulation, and programs and services.

The evaluation findings demonstrated that CHMS has already played a role in validating data from other self-reported surveys and that its data have been used in research and for scientific discovery. More specifically, there was evidence of use for policy and decision making in the areas of physical activity, environmental exposure, nutrition markers and oral health; however, it is still too early in the program's life cycle to demonstrate CHMS's full potential.

Despite the many examples of use, the limited accessibility of the data has been raised by key informants as a potential factor that could detract from its increased use (CHMS expected outcome).

Evidence of use

Canadian Health Measures Survey data used within Statistics Canada

Internal researchers highlighted specific areas where CHMS data are used within Statistics Canada. For example, CHMS helps validate other self-reported surveys, such as the Canadian Community Health Survey. It also helps improve micro-simulation models (by increasing the confidence of projections), which are developed by Statistics Canada and used by policy makers for creating or updating policies.

Canadian Health Measures Survey data used in decision making

External researchers provided examples from their area of expertise of where CHMS data have been used. These examples mainly reflect CHMS's contribution to

  • determining priorities and pressing issues
  • managing risk within HC and PHAC
  • making strategic and financial decisions
  • planning for the future
  • updating guidelines and regulations
  • making international comparisons
  • informing global negotiations.

In the same vein, almost all primary data users from the evaluation survey had used CHMS data for research purposes. Of them, 50% had used CHMS data for policy analysis, and some of them had used the data for program and service development.

Table 10 depicts the most significant examples to date of CHMS data and research used to influence policies and regulations in four thematic areas: environmental exposure, physical activity, nutrition markers, and oral health.

Table 10 Evidence of use of Canadian Health Measures Survey data
Theme Evidence of use Organization and year of publication
Environmental exposure Report on Human Biomonitoring of Environmental Chemicals in Canada Health Canada (HC), 2010
Physical activity The Canadian Physical Activity Guidelines HC and Canadian Society for Exercise Physiology, 2011
Canadian sedentary behaviour guidelines for children and youth
Active Healthy Kids Canada Report Card on Physical Activity for Children and Youth National organization Active Healthy Kids Canada, 2011, 2012
Nutrition markers Updated guidelines related to folic acid intake in women of child-bearing age, based on results of vitamin B12 and red blood cell folate Society of Obstetricians and Gynaecologists of Canada, 2011
Clinical Utility of Vitamin D Testing: An Evidence-based Analysis (report) Ontario Medical Advisory Secretariat, 2009
Vitamin D data informed the debate regarding the update of the recommended intake guidelines Institute of Medicine, USA, 2011
Oral health Summary Report on the Findings of the Oral Health Component of the Canadian Health Measures Survey 2007-2009 HC, 2010
Canada's Oral Health Report Card: A call to action Canadian Dental Hygienists Association, 2010
Oral Health – More than Just Cavities (report) Ontario's Chief Medical Officer of Health, 2012

The selected examples of use are drawn from a very short period (approximately 2.5 years since the data were released). Key informants underlined that "it is a very new survey on the grand scheme, having been around for such a short period of time." Nonetheless, they believed that CHMS's reputation is high and that it will have a positive impact greater than any other national survey. They testified that "in various meetings they attended, under various hats, in the physical activity sector, or health sector, the survey comes out and is being mentioned as a useful and respected source of information," that it generates a lot of attention around decision-making tables at HC and PHAC, and that, even though "they might not be in a stage to have a published document per se, [...] they are certainly making decisions based on it."

4.2.1.5 Potential factors that contribute or detract from the achievement of expected outcomes as identified by key informants

Accessibility as a barrier to use

"There is wonderful data collected, the survey should be proud, but if it stays inside it is useless to advance science, inform policy, etc."

External researcher

The evaluation findings revealed some issues with access to the data that may lead to adverse effects on their use for research and policy development. Specifically, though most researchers are aware of the data, many are unaware of how to access them. This view is corroborated by external interviewees who believe that the lack of accessibility (or complexity of access) is a barrier to the use of the data.

Documents reviewed revealed that an analytical plan, a strategic document describing the CHMS analytical program and dissemination strategy, was developed in 2005 and updated several times—the last time in 2010. This plan was intended to be a living document that would facilitate the optimal use of CHMS data and define the scope of the analysis to maximize the data's impact and utility. The plan outlines the analytical objectives and the analytical activities related to the survey, including those that support them.

According to external researchers, most of the deliverables specified in the plan were completed, but the interviewees were unaware of the current analytical plan. They felt that there should be better communication and collaboration to avoid duplicating research and analysis should different research groups look at the same data.

In the same vein, the evaluation study found that the program does not formally and systematically track all publications that use CHMS data, and interviews with external researchers confirmed that they do not always inform CHMS management about their peer-reviewed publications. This fact contributes to the previously identified weaknesses with regard to coordination and better communication of analytical efforts among researchers and other interested parties. The need for comprehensive, formal and up-to-date information on all studies and publications that are based on CHMS data was clearly identified by several external interviewees as a prerequisite for good collaboration. Researchers would know whether someone has already studied and published results on their potential focus of interest before starting their studies; thus, "they would be able to plan and scope their research better."

According to CHMS management, starting this fall, CHMS will be part of the Health Statistics Division analytical planning process, which is conducted every year. The analytical plan will be produced based on consultations with key stakeholders. As part of this planning exercise, decisions will be made about future articles and shared responsibilities between Statistics Canada, HC, PHAC and external experts.

Other factors

Budgetary restrictions and financial pressure were specified by most of the interviewees as a potential factor that could detract from the achievement of expected outcomes. External researchers (cost-recovery clients) specified cuts to cost-recovery funding, which would mean shorter content and fewer measures.

Program management mentioned that the sustainability of the clinic staff and the knowledge that resides in a few key people could potentially affect the delivery of CHMS. In fact, they are currently developing a new human resources strategy for the mobile clinic staff and are trying to ensure that knowledge transfer occurs. It was also specified during interviews that they are working on a succession planning strategy for mitigating the risks associated with the lack of knowledge transfer; however, no evidence became available within the timeframe of the evaluation study.

One external researcher mentioned a possible "perception danger" as a factor that might affect the survey. He feared that because a trend cannot be observed at the present time, it may be perceived that the measures are not worth the money. He explained that some trends can take a very long period of time (e.g., 20 years) to emerge.

4.2.1.6 Unintended impacts from the implementation of the Canadian Health Measures Survey

There were no significant unintended impacts revealed by this evaluation. Given the short period of time CHMS has been in place, it may be too early to observe unintended outcomes.

According to program management, however, since the survey was modeled after NHANES—a survey that has existed for the last 50 years—there was an opportunity to take into consideration their experience and lessons learned, which, in turn, decreased the likelihood of unintended impacts. According to the internal interviewees, the complexity of the survey and the required extensive and thorough planning process to ensure the controlled environment of operations and quality of data further limits the likelihood of variations and unintended effects occurring and being observed.

In terms of positive impacts, interviewees mentioned a couple of examples: 1) some individual medical problems were identified and respondents were notified as soon as possible, which allowed them to seek advice from their doctor in a timely manner; and 2) some environmental contaminant data that were collected as supplementary measures became needed earlier than expected. For example, measures of selenium were included in the metal profile along with other planned measures because it was easy to include them in the same collection and laboratory efforts. These measures were recently needed, as nationally representative data on selenium, for a selenium risk assessment that had not been planned at the time of collection.

4.2.2 Performance - Efficiency and Economy

The evaluation findings on the efficiency and economy of CHMS are based on financial analysis of resource availability and use, and trend analysis of expenditures and variances. Footnote 27 Anecdotal information obtained through interviews provides key informants' views and perceptions regarding the efficiency of the program, the alternative approaches that might increase efficiency, and the extent to which CHMS duplicates or complements other surveys or existing programs.

The shift from predominantly cost-recovery to core funding of CHMS as a result of the 2008 Action Plan helped stabilize the survey, which had a positive impact on CHMS. In particular, it allowed for better long-term planning and for seeking opportunities to increase operational efficiency, gains that could be demonstrated more clearly once performance data on efficiency are collected over time.

The use of existing CHMS infrastructure by other federal partners allows them to achieve their results while reducing time and costs. Sharing complementary knowledge between Statistics Canada and other federal partners and academia increases the analytical capacity in the health domain.

CHMS is the only nationally representative direct measures survey in Canada and complements other Canadian and international studies. Therefore, there is no duplication of effort that might influence efficiency.

External interviewees identified areas that could increase efficiency and economy, and suggested potential alternative approaches to some aspects of conducting the survey (e.g., flexible hiring practices and the use of local clinics and infrastructure instead of mobile examination clinics). However, these options could impair the quality of data collection.

4.2.2.1 Resources in place to support the implementation of the Canadian Health Measures Survey

"The dedicated funding (until this fiscal year) helped tremendously to establish CHMS. We now have a nice work flow that is consistent and predictable as opposed to being at the mercy of renewal of the funding every year. We were able to plan better long-term."

Internal interviewee

Evidence shows there was a clear shift from cost recovery to core funding after the 2008 Action Plan, as detailed in charts 3 and 4 below. Cost recovery fell from 90.6% of funding in 2007/2008 to 24.2% in 2011/2012, while core funding rose from 9.4% to 75.8% over the same period. Many interviewees confirmed the Action Plan's positive impact in stabilizing the funding for CHMS.

 

Chart 3 Core funding, as of April 2012

Description: Chart 3 is a clustered column chart showing resource allocation, in thousands of dollars, for each year from 2008 to 2012. It has three series: core funding (Treasury Board submission), budgeted funding, and actual expenditures. Each series reaches its minimum in 2008 and its maximum in 2012. The underlying data are given below.
Data for Chart 3 Core funding, as of April 2012, resource allocation ($ thousands)
Year Core funding (Treasury Board Submission) Budgeted funding Actual Expenditures
2008 1,251 1,761 2,062
2009 9,841 9,492 8,714
2010 12,235 11,235 10,653
2011 12,201 11,677 11,614
2012 35,528 34,165 33,043

Chart 4 External cost-recovery funding, as of April 2012

Description: Chart 4 is a clustered column chart showing resource allocation, in thousands of dollars, for each year from 2008 to 2012. It has two series: planned cost-recovery funding and actual cost-recovery expenditures. Both series reach their minimum in 2011 and their maximum in 2012. The underlying data are given below.
Data for Chart 4 External cost-recovery funding, as of April 2012, resource allocation ($ thousands)
Year Planned cost-recovery funding Actual cost-recovery expenditures
2008 11,213 11,258
2009 4,164 4,168
2010 4,682 4,585
2011 3,721 3,713
2012 23,780 23,724

All variances between CHMS's budget and actual expenditures, for both core and cost-recovery funds, are less than 4% over the five-year period. Financially, this indicates good resource management. Internal interviewees suggested that this could be attributed to the careful and thorough planning required by the complexity of the survey.
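The under-4% figure can be checked directly against the chart data above. The short sketch below (an illustrative calculation, not part of the evaluation itself) aggregates the budgeted and actual figures from charts 3 and 4 over the five years and computes the overall variance for each funding stream:

```python
# Budgeted vs. actual expenditures ($ thousands), taken from charts 3 and 4.
core = {          # year: (budgeted funding, actual expenditures)
    2008: (1_761, 2_062),
    2009: (9_492, 8_714),
    2010: (11_235, 10_653),
    2011: (11_677, 11_614),
    2012: (34_165, 33_043),
}
cost_recovery = { # year: (planned funding, actual expenditures)
    2008: (11_213, 11_258),
    2009: (4_164, 4_168),
    2010: (4_682, 4_585),
    2011: (3_721, 3_713),
    2012: (23_780, 23_724),
}

def five_year_variance(series):
    """Aggregate variance: |total actual - total budget| / total budget."""
    budget = sum(b for b, _ in series.values())
    actual = sum(a for _, a in series.values())
    return abs(actual - budget) / budget

print(f"Core:          {five_year_variance(core):.1%}")          # ~3.3%
print(f"Cost recovery: {five_year_variance(cost_recovery):.1%}") # ~0.2%
```

Both aggregate variances come out well under the 4% threshold, consistent with the paragraph above. (Individual years can deviate more; the claim holds over the five-year totals.)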

In terms of the number of FTEs, the budgeted number almost always underestimated the actual number required, as detailed in Table 11. As a result, non-salary funds had to be re-profiled to salary funds.

Table 11 Number of full-time equivalents
  Full-time equivalents
2007/2008 2008/2009 2009/2010 2010/2011 2011/2012 2012/2013
number
Core budgeted 0 13 54 84 78 ...
Core actual 2 17 93 96 119 ...
Cost-recovery budgeted 57 77 0 0 0 ...
Cost-recovery actual 69 74 0 0 0 ...

The trade-off of cost versus quality

Analysis of financial data on operational expenditures (e.g., equipment and assets) revealed that maintaining the mobile examination clinics (MECs), including the medical and laboratory equipment as well as the trained and knowledgeable staff who travel with the clinics, accounted for the program's most significant costs. Among the assets, the most significant categories are medical and laboratory equipment: the approximate cost of one MEC is $335,000, with $35,000 for the trailer and $300,000 for the equipment inside. It is also clear that the MECs currently in operation are almost at the end of their life cycle, with an overall remaining value of about 10%. Consequently, finding opportunities to optimize the cost and resources used in MEC operations could be significant for CHMS's performance in terms of efficiency and economy.

"If we took only one of the objectives (say, biomonitoring only), you would not necessarily need mobile clinics across the country. However, if you are interested in having this comprehensive view of the overall health of Canadians and want to correlate it with lab measures, the way the survey is designed is the most efficient way to achieve this consistency and the comparability of the results over time."

Interviewee

Program management and some experts admit that the cost of collecting direct measures through MECs is high. However, they feel that controlling the MEC environment is the key to ensuring high-quality results; even a simple inconsistency in the environment, such as odours, could affect the measures. In the end, data quality was paramount and directly linked with the high costs.

Furthermore, interviews and document review indicated that collecting health measures in mobile clinics, although expensive, is the method used by other countries and surveys (for example, NHANES). The model was selected to achieve the same level of rigour and consistency.

However, some external researchers, while recognizing the importance of controlling the data collection environment and keeping it consistent, still believe that alternatives such as the use of existing local facilities and structures are worth further consideration as a way to increase efficiency without compromising quality. They also indicated that the program's efficiency and responsiveness might be improved by using more flexible hiring procedures (e.g., hiring people external to the public service, whenever possible).

Evidence of the program's effort to generate efficiencies

Interview findings indicated evidence of the program's effort to apply good stewardship and to use resources economically. Examples include selecting locations for trailers (e.g., using federal or public locations for a nominal fee rather than private, for-profit locations, to minimize rental costs) and contracting out some services, such as maintenance and IT support, to local companies rather than sending maintenance personnel from head office. As program managers specified during interviews, they are also trying to minimize personnel overtime and travel costs.

In addition, as the program matured, its operating structure was re-organized through mergers. This generated cost-savings through lower salary expenditures, while at the same time providing a more efficient management environment. The current CHMS management continues to look for internal operational efficiencies through regular operational reviews.

4.2.2.2 Possible efficiencies that are being gained as a result of the Action Plan

Internal and external interviewees agree that the CHMS infrastructure provides HC and PHAC with the opportunity to reduce their program costs. They can invest their funding in additional health measures rather than in developing their own parallel infrastructure.

"We are the national statistical agency and we are working well within our national mandate. They are using existent, well established, highly functioning national infrastructure. That is being of itself one definition of efficiency."

Internal interviewee

CHMS worked in collaboration with HC's First Nations and Inuit Health Branch (FNIHB) when biomonitoring the First Nations population, which is part of FNIHB's responsibilities under the Action Plan. According to CHMS program management, using comparable methods, procedures, tools, and reference labs created efficiencies for FNIHB at the operational level, but also provided better comparability of the results because the same measures were used in the First Nations population as in the general Canadian population. However, program management believes that there is room for even more collaboration between the two initiatives.

Internal and external researchers suggested that a collaborative approach in research provides the best opportunities to combine the strengths of all parties involved, and to increase the knowledge capacity and the analytical power of the studies. Statistics Canada's researchers can leverage their technical capacity, including easy access, manipulation and statistical interpretation of the data. External researchers tend to be the subject-matter experts and thus, they are well positioned to interpret, produce and share results with the scientific community. Through this collaboration, CHMS contributes to the increased analytical capacity in the health domain.

4.2.2.3 Possible duplications, complementarities or alternatives

Evaluation findings revealed that most key informants believed there is no duplication of CHMS with other provincial or international surveys. In fact, no other survey currently provides nationally representative direct health measures data in Canada. While several provincial surveys do exist (e.g., Ontario Health Survey), most of them do not include physical measures or they use self-selected sampling, which, according to program management, could cause bias issues (e.g., only healthy people may choose to participate). Internationally, many countries (approximately 20) are conducting similar programs, and, according to internal management interviews, there are efforts to compare the results of individual countries.

As previously mentioned, evidence has also pointed to the fact that CHMS plays a role in providing complementary data to other kinds of surveys (e.g., self-reported, administrative) collected in Canada, as it helps to validate and adjust them.

Although all key informants believed that CHMS should be funded by the federal government, some external researchers did raise the question of whether the federal government should be conducting all activities versus contracting out some of the work. Alternative models for program delivery were suggested, such as having some of the CHMS work completed through non-federal government contracts; however, this alternative may impact respondents' level of trust in CHMS, as well as CHMS's response rate and data quality.

5. Conclusions

5.1 Relevance

CHMS represents a unique program in Canada, providing direct health measures data to support health research, policy, and decision making. Evidence shows that the program is relevant to Canadians and health organizations, with a clear present and future need for the program. Although most feel that CHMS is responsive to current and emerging content needs, no content plan was ever developed for CHMS.

CHMS is well aligned with the priorities of the federal government and Statistics Canada. The federal government is in the best position to deliver CHMS, and has specific legislative obligations being met through the program, making the delivery of the program a legitimate, appropriate and necessary activity.

5.2 Performance – Effectiveness

CHMS outputs have been produced as expected and the expected immediate outcome is being achieved: reliable and usable data are available on the baseline health status of Canadians, and on the level of, and exposure to, environmental contaminants. Some issues exist around the accessibility of the data, which could impair the long-term outcomes if not addressed. More specifically, even though most researchers are aware of the data, many are unaware of how to access them. This view is corroborated by external interviewees, who believe that the lack of accessibility (or complexity of access) is a barrier to the use of the data. Some key informants commented on the unavailability of microdata files for CHMS on the Internet and on how researchers must rely on the RDCs to use CHMS data files. Internal and external researchers raised various concerns related to RDCs, such as a lack of awareness of how to access the data, and timelines to get proposals approved (sometimes up to six months).

Evaluation findings demonstrated evidence of awareness of CHMS data among the general public, researchers and data users as a result of publications based on Cycle 1 data. However, issues such as the 'reach' of publications and the lack of promotion of CHMS data to scientific communities were identified as limitations to awareness, in addition to the timing of this evaluation. Researchers felt that there should be better communication and better collaboration to avoid duplicating research and analysis should different research groups look at the same data.

The evaluation findings provided evidence that CHMS data are being used for research, policy, and decision making—the expected long-term outcome. Researchers have already used CHMS data to validate self-reported data from other surveys, as well as in research for scientific discovery. In addition, CHMS data are used for policy and decision making in the areas of physical activity, environmental exposure, nutrition markers and oral health. However, it is still too early in the program's life cycle to broadly demonstrate CHMS's full potential. Information on publications (research studies) is tracked only internally and not shared with external stakeholders, making collaboration and reporting, in some instances, more difficult.

5.3 Performance – Economy and efficiency

The shift from predominantly cost recovery to core funding of CHMS as a result of the 2008 Action Plan helped stabilize the survey, which had a positive impact on CHMS. In particular, it allowed for better long-term planning and for seeking opportunities to increase operational efficiency. The evaluation results demonstrated sufficient human and financial resources to support the program. Some external parties have indicated that increased efficiency and economy could be achieved through more flexible hiring practices and the use of local clinics and infrastructure as an alternative to the mobile clinics. While these options might be worth further consideration, their feasibility must be assessed against the extent to which they could impair the quality of data collection.

Furthermore, the use of existing CHMS infrastructure by other federal partners allows them to achieve their objectives in an efficient way while reducing time and costs. Sharing complementary knowledge between Statistics Canada, other federal partners and academia increases the analytical capacity in the health domain.

CHMS is the only nationally representative direct measures survey in Canada and complements other Canadian and international studies. Therefore, there is no evidence of duplication of efforts that might influence efficiency.

6. Recommendations

Three recommendations emerge from the evaluation findings. They advocate for enhanced planning and coordination with various stakeholders with respect to content and analytical studies, improved accessibility of CHMS data, and a strengthened capacity for systematic performance measurement to demonstrate the achievement of results.

Recommendation 1 – Relevance

It is recommended that the management of CHMS enhance planning and external coordination with researchers and stakeholders—both internally within the federal government and externally with researchers from academia—with respect to content determination (content plan) and the planning of analytical research based on the content.

Recommendation 2 – Performance

It is recommended that the management of CHMS increase awareness of the data and improve their accessibility by promoting CHMS to a wider audience of potential clients and users, and by providing support on how to use the data.

Recommendation 3 – Performance measurement

It is recommended that the management of CHMS improve the performance measurement system, as a tool, to systematically collect performance data and monitor the progress toward achieving its outcomes.

3a) Improving data

To accurately assess the achievement of the long-term outcome and the impact of CHMS, it is recommended that the management of CHMS ensure that a formal tracking system, such as the Client Relationship Management System (CRMS), is in place for publications and studies based on CHMS data, to demonstrate their use. The information on publications using CHMS data must be shared on a recurring basis with researchers and stakeholders to enhance further external coordination.

3b) Financial information

It is recommended that accurate financial information be made available by the Finance Branch to support CHMS performance data collection and to demonstrate the level of efficiency over time for decision making and accountability reporting.

7. Management response and action plan

Recommendation 1

Relevance:
It is recommended that the management of CHMS enhance planning and external coordination with researchers and stakeholders—both internally within the federal government and externally with researchers from academia—with respect to content determination (content plan) and the planning of analytical research based on the content.

Statement of agreement or disagreement

Management agrees with this recommendation.

Management Response

The evaluation reinforced management's strategic objective this fiscal year to develop a longer-term content plan for CHMS to ensure ongoing relevance. This work has already begun and will be completed within the next 12 months. In addition, the evaluation provided evidence that better planning for analytical work across all three federal departments—Statistics Canada, PHAC and Health Canada—would be a valuable new strategic objective. A Terms of Reference has been drafted for a new CHMS Analytical Working Group, which will be chaired by Statistics Canada and composed of representatives from PHAC and Health Canada.

Recommendation 1
Timeline | Deliverables | Responsible party
January 1, 2014 | Eight-year content plan | Director, Health Statistics Division (HSD)
April 1, 2013 | CHMS Analytical Working Group with PHAC and Health Canada | Director, HSD

Recommendation 2

Performance:
It is recommended that the management of CHMS increase awareness of the data and improve their accessibility by promoting CHMS to a wider audience of potential clients and users, and by providing support on how to use the data.

Statement of agreement or disagreement

Management agrees with this recommendation.

Management response

Increasing the use and awareness of CHMS is an important objective that has been clearly identified within the evaluation. Management agrees that more work can be done in this area and has developed three clear initiatives to better promote and support the use of CHMS. First, we have begun training additional staff within HSD's Client Services area to respond to external client service requests for information on CHMS and for custom tabulations using CHMS data. Second, management is developing a communications plan to increase awareness of both the CHMS data within the RDCs and the availability of the Biobank samples for researchers. Third, management is considering holding more workshops across the country to increase users' capacity to analyze CHMS data. This last strategy, however, requires resources to pay for travel and additional costs to conduct such workshops. Achieving this objective will depend upon our ability to identify sufficient resources to increase the number of workshops held over the next two years.

Recommendation 2
Timeline | Deliverables | Responsible party
April 1, 2013 | Two additional trained staff | Director, HSD
April 30, 2013 | Communications plan | Director, HSD
Ongoing | Workshops | Director, HSD

Recommendation 3

Performance measurement
It is recommended that the management of CHMS improve the performance measurement system, as a tool, to systematically collect performance data and monitor the progress toward achieving its outcomes.

3a) Improving data
To accurately assess the achievement of the long-term outcome and the impact of CHMS, it is recommended that the management of CHMS ensure that a formal tracking system, such as the Client Relationship Management System (CRMS), is in place for publications and studies based on CHMS data, to demonstrate their use. The information on publications using CHMS data must be shared on a recurring basis with researchers and stakeholders to enhance further external coordination.

3b) Financial information
It is recommended that accurate financial information be made available by the Finance Branch to support CHMS performance data collection and to demonstrate the level of efficiency over time for decision making and accountability reporting.

Statement of agreement or disagreement

3a) Management agrees with this recommendation.
3b) Management agrees with this recommendation.

Management response

3a) Management already has an adequate tracking system to monitor publications and studies based on CHMS data; however, this information has not been systematically shared with external researchers. As such, management proposes to post the list on the external CHMS website and update it annually.

3b) Management is undertaking an operational review of the CHMS Operations Section to ensure that processes and practices are optimal. This review will be completed over the next six months and will lead to implementing performance indicators to measure the efficiency of the program.

Financial Branch Action Plan:
The Finance Branch will develop standard reports and financial indicators in support of all programs. The organization already has detailed financial information available that can be easily formatted to better address programs' evaluation needs and be made available on a timely basis. The format and elements to be measured will be developed in collaboration with, and validated by, program managers and the departmental evaluation group. The final recommendation will be presented to and approved by the Administrative Practices Committee.

Recommendation 3
Timeline | Deliverables | Responsible party
December 2013 | Tracking list on website | Director, HSD
September 2013 | Operational review report | Director, HSD
January 2014 | Efficiency indicators | Director, HSD

Financial Branch Action Plan
Timeline | Deliverables | Responsible party
March 2014 | Key standard program financial indicators approved | DG, Finance
As per schedule to develop Program PM Strategy (see RBAEP) | Program financial coding structure reviewed and adjusted to support the Program Performance Measurement Framework | DG, Finance and Director, HSD

Communications Branch Action Plan
Timeline | Deliverables | Responsible party
January 2014 | Development of a formal tracking system | DG, Communications

Figures

Figure 1: Canadian Health Measures Survey Logic Model

Description: Figure 1 Canadian Health Measures Survey Logic Model
  • This figure depicts the logic model for the Canadian Health Measures Survey.
  • The inputs consist of core funding, cost-recovery funding agreements and full-time equivalents.
  • Part of the inputs supports the program activities (planning, collection and dissemination) directly; the rest of the inputs support an infrastructure (for example, trailers and laboratory facilities) that serves as an enabling function allowing the program activities to occur.
  • The activities consist of planning, collection and dissemination.
  • The outputs of planning consist of approved survey content, survey methodology, system tools and equipment, guides and protocols, QA/QC procedures and trained collection staff. These outputs are used as inputs for collection.
  • The outputs of collection consist of completed household questionnaires (all cycles), physical measures data (all cycles), indoor air samples (in cycles 2 and 3), tap water samples (only in cycle 3), biospecimens (all cycles), biobank samples (all cycles) and physical activity monitors (all cycles). These outputs are used as inputs for dissemination.
  • The outputs of dissemination consist of Daily releases, microdata files, custom tables, peer-reviewed articles in Health Reports and other journals, fact sheets and summary tables. The dissemination outputs lead to the immediate outcome.
  • The immediate outcome is: Reliable and usable data for decision makers, researchers and Canadians on the baseline health status of Canadians and the level of, and exposure to, environmental contaminants. The immediate outcome leads to the intermediate outcome.
  • The intermediate outcome is: Increased awareness among data/information users of the collected data/information. Finally, the intermediate outcome leads to the long-term outcome.
  • The long-term outcome is: Decision makers increasingly use the information on the associations between contaminants and illness to guide decision-making in public health practice, research, policy, regulations, and programs and services.

Appendices

  • Appendix 1: Canadian Health Measures Survey process map
  • Appendix 2: A composite logic model for all activities under the Action Plan
  • Appendix 3: Canadian Health Measures Survey (CHMS) evaluation matrix
  • Appendix 4: A list of publications reviewed as part of the literature review

All appendices are available upon request. Please contact AEB-Professional-Practices@statcan.gc.ca

Notes:

Footnote 1

For more information on the joint REB for HC and PHAC, see www.hc-sc.gc.ca/sr-sr/advice-avis/reb-cer/index-eng.php.

Footnote 2

REB website, www.hc-sc.gc.ca/sr-sr/advice-avis/reb-cer/index-eng.php.

Footnote 3

CHMS core funding includes only the operating budget and excludes the employee benefit plan and PWGSC accommodation costs.

Footnote 4

The 2010 Annual Report on the Health of the Evaluation Function prepared by TBS outlines the importance of calibrating evaluations, and in this context, to 'calibrate' is to determine the appropriate balance between the level of evaluation effort and the evaluation context (e.g., program's materiality, associated risks, etc.) (p. 36).

Footnote 5

The program area was able to supply the names of their partners and cost-recovery clients, the names of the researchers who had access to the shared files at HC and PHAC, and the names of those who had requested access to RDCs to use CHMS data files. It is most likely that the names received are not an exhaustive representation of everyone that has been exposed to the data.

Footnote 6

Tremblay, MS. The need for directly measured health data in Canada. Canadian Journal of Public Health 2004; 95: 165-8.

Footnote 7

Connor Gorber S, Tremblay MS, Moher D, et al. A comparison of direct versus self-report measures for assessing height, weight and body mass index: a systematic review. Obesity Reviews 2007; 8: 307-26.

Footnote 8

Statistics Canada. Canadian Health Measures Survey: Rationale, background and overview, Health Reports, Background Papers on the Canadian Health Measures Survey, Special Issue, supplement to Volume 18, Dec 2007.

Footnote 9

www.cdpac.ca

Footnote 10

www.hc-sc.gc.ca/english/care/romanow/hcc0086.html

Footnote 11

Statistics Canada. "Canadian Health Measures Survey: Rationale, background and overview" Health Reports: Background Papers on the Canadian Health Measures Survey, Special Issue, supplement to Volume 18, Dec 2007. Statistics Canada Catalogue no. 82-003-XIE2007101

Footnote 12

Centers for Disease Control, National Center for Health Statistics. National Health and Nutrition Examination Survey. Available at: www.cdc.gov/nchs/nhanes.htm. Accessed January 1, 2007.

Footnote 13

The NHANES program began in the early 1960s and has been conducted as a series of surveys focusing on different population groups or health topics. In 1999, the survey became a continuous program that has a changing focus on a variety of health and nutrition measurements to meet emerging needs. The survey examines a nationally representative sample of about 5,000 people each year. (Source: www.cdc.gov/nchs/nhanes.)

Footnote 14

Kuczmarski RJ, Ogden CL, Guo SS, et al. 2000 CDC growth charts for the United States: methods and development. Vital Health Statistics 2002; 11(246): 1-190.

Footnote 15

National Center for Health Statistics. National Health and Nutrition Examination Survey Data Accomplishments. Available at www.cdc.gov/nchs/about/major/nhanes/DataAccomp.htm Accessed June 1, 2007.

Footnote 16

Dunstan DW, Zimmet PZ, Welborn TA, et al. The rising prevalence of diabetes and impaired glucose tolerance. Diabetes Care 2002; 25: 829-34.

Footnote 17

Aromaa A, Koskinen S, Huttunen J. Health in Finland. KTL – National Public Health Institute. Ministry of Social Affairs and Health. Helsinki, Finland: Edita Ltd., 1999.

Footnote 18

Aromaa A, Koskinen S. (eds.). Health and Functional Capacity in Finland. Baseline Results of the Health 2000 Health Examination Survey. Helsinki, Finland: KTL – National Public Health Institute, 2004.

Footnote 19

Statistics Canada. Canadian Health Measures Survey: Rationale, background and overview, Health Reports, Background Papers on the Canadian Health Measures Survey, Special Issue, supplement to Volume 18, Dec 2007.

Footnote 20

These requirements were compiled based on interviewees' preferences and suggestions. However, it is known that some of them are already in place (e.g., the capacity of record linkages), while others are out of scope for a national representative survey such as CHMS.

Footnote 21

Medical doctors and health professionals use various guidelines for interpreting conditions observed in patients and for determining appropriate treatments. CHMS data are used as Canada-representative reference data for creating and updating some of these guidelines (e.g., the blood lead interpretation guide and the guidelines related to folic acid intake in women in child-bearing age). In the absence of Canadian, nationally representative health reference data, such guidelines have relied on data or practices from other countries, such as the United States.

Footnote 22

When studying a potential health condition or issue, researchers must compare their research interest with the 'normal' state (for example, to confirm the significance of an issue or to assess the effect of an intervention). According to researchers, CHMS data provide a quintessential, Canada-representative control group for such studies, thus, providing "an incredible service to researchers in so many areas."

Footnote 23

As part of CHMS data collection, samples of blood and urine are collected from respondents and stored in a Biobank. The Biobank is hosted at the National Microbiology Laboratory in Winnipeg. There are strict guidelines and protocols in place, compliant with the Statistics Act and the regulations of the Research Ethics Board of HC to ensure confidentiality, security and appropriate access to the samples. Access to the Biobank became available in June 2012 and is regulated by the Biobank Committee.

Footnote 24

Biomonitoring measures include environmental exposure measures, but also biomarkers from seven other themes related to health.

Footnote 25

Details already presented in Table 5.

Footnote 26

These concerns were not limited to CHMS, but apply to all surveys requiring the use of an RDC.

Footnote 27

Some challenges were faced when assessing the efficiency and economy of the CHMS program. The practice of compiling financial data by fiscal year rather than by cycle, the lack of baseline data on cost and resources used (as a result of the novelty and uniqueness of CHMS in Canada) and the lack of time and evaluation resources to perform any international comparisons with similar surveys were some of the factors that impacted the completeness and the conclusiveness of the evaluation evidence.

Archived – Audit of Data-Sharing Agreements

Final audit report
Audit of Data-Sharing Agreements

Statistics Canada
April 15, 2010
Project Number: 80590-60

  • Executive summary
  • Introduction
    • Background
    • Objectives
    • Scope and Approach
    • Authority
  • Findings, Recommendations and Management Responses
    • DSA Confidentiality Compliance Environment
    • DSA Risk Assessment
    • DSA Information Management and Communication
    • DSA Confidentiality Compliance Status
  • Appendices
    • Appendix A: Audit Criteria
    • Appendix B: Cumulative Number of Active Formalized Statistics Canada Data-Sharing Agreements at the end of 2008
    • Appendix C: Glossary

Executive summary

Statistics Canada has entered into data-sharing agreements (DSAs) under sections 11 and 12 of the Statistics Act since 1976. DSAs have become a key business process. These agreements now number 500, cover nearly all business surveys and a majority of household surveys, and enjoy certain exceptions regarding the release of confidential respondent information. In recent years, data-sharing has become a growing and increasingly complex area to manage. Ensuring confidentiality protection of shared data, a key value of Citizen-Focused Service, Public Service Values and Stewardship at Statistics Canada, is a challenge. DSAs are covered by a multi-party management framework, characterized by distributed management under various responsibility arrangements between the units of Statistics Canada and DSA partners (external Canadian organizations). The risks of non-compliance with the legislative and policy requirements on confidentiality protection, and of damage to the reputation of Statistics Canada, were ranked as high.

The objectives of this audit were to provide the Chief Statistician and the Departmental Audit Committee (DAC) with assurance that Statistics Canada's DSA Confidentiality Management Control Framework (MCF) is adequate and effective over the entire life-cycle of DSAs, and that activities supporting the DSA Confidentiality MCF are compliant with Government of Canada and Statistics Canada acts and policies on confidentiality over the entire life-cycle of DSAs.

The audit was conducted by the Internal Audit Services of Statistics Canada and the evidence was gathered in compliance with the Internal Audit Standards for the Government of Canada and the International Professional Practices Framework (IPPF) of the Institute of Internal Auditors.

The audit found that Statistics Canada and its partners are compliant with the relevant acts and policies. However, opportunities exist to strengthen the management control framework related to DSAs in the areas of risk management, monitoring and information management and communication.

The DSA Confidentiality Management Control Framework is composed of five elements: planning and reporting, information management and communication, risk assessment, monitoring, and the confidentiality compliance environment. The DSA confidentiality compliance environment, as an element of the management control framework, is satisfactory. However, the audit determined that there is no single comprehensive document or policy on the management of DSAs that would combine all relevant confidentiality compliance requirements, cover the entire DSA life-cycle and establish an appropriate management control framework.

Systematic risk management is an underpinning of good governance. The DSA management control framework has fragmented and often ad hoc practices related to risk management when assessing confidentiality compliance. Opportunities to advance risk management practices exist at the departmental and divisional levels.

Access to information is a key value of Statistics Canada, which includes access to shared data by DSA partners under the data-sharing agreements. The audit found dormant DSAs and instances where shared data were not provided to partners in a timely manner. Information management and communication practices would benefit from improved integration of records and the development of integrated protocols.

With respect to the DSA confidentiality compliance status, Statistics Canada and DSA partners are compliant with the legislative and policy requirements, and no confidentiality breaches have been detected. However, the information required to assess confidentiality compliance was often fragmented and incomplete at the monitoring stage of the DSA life-cycle. A management monitoring regime would provide sufficient and reliable information for decision-making as it relates to relevance and confidentiality management.

Overall, there is an opportunity to advance to a strategic model of risk management and apply the principles of active monitoring with regards to DSAs, which would improve management effectiveness.

Introduction

Background

For more than 30 years, Statistics Canada has exercised its mandate to enter into statistical data-sharing agreements (DSAs) with other organizations under the authority of sections 11 and 12 of the Statistics Act. DSAs have become a key business process. These agreements now number 500, cover nearly all business surveys and a majority of household surveys, and enjoy certain exceptions regarding the release of confidential respondent information, either with or without the respondent's consent, provided that the legal requirements for the provision of data-sharing information, consent rights and confidentiality protection are respected by all parties. In general, data-sharing for statistical purposes occurs when a statistical or information inquiry is initiated by joint survey partners, or where a common data resource is equally and jointly owned by two or more partners. Data-sharing is exercised when there are significant reductions in response burden and compliance costs for data-sharing partners, as well as improvements in statistical data accuracy, coverage, relevance and timeliness.

Specifically, the Statistics Act allows for two types of DSAs:Footnote 1

  • s.11 DSAs: data-sharing with provincial/territorial statistical offices that are subject to legislation similar to the federal Statistics Act. Such legislation provides the authority to collect information for statistical purposes, to compel response from respondents and to request mandatory data-sharing; it also stipulates legal requirements to ensure confidentiality protection of respondent information and to notify respondents of the planned data-sharing.
  • s.12 DSAs: data-sharing with other federal government departments, non-statistical provincial government departments, municipal corporations and other legal entities, which may or may not have the legal authority (under their own legislation) to compel response and to request mandatory rather than voluntary data-sharing. The Act stipulates legal requirements to ensure confidentiality protection of respondent information, to notify respondents of the planned data-sharing and, in the case of voluntary data-sharing, to inform respondents of their right to object to data-sharing.

By 2008, the number of statistical data-sharing agreements had reached 500. Most DSAs (94%) provide for voluntary data-sharing; these fall under s.12, where respondents have the right to refuse to share their information. The remaining agreements are split between mandatory data-sharing under s.11 provincial/territorial DSAs (4%) and mandatory data-sharing under so-called s.12+ DSAs (2%).Footnote 2 The accumulation of DSAs reflects the need for cooperation among Canadian organizations in the collection, compilation and publication of statistical information.

At the same time, the federal privacy and security control environment tightened and became more complex as the Privacy Act (1985), the TBS Policy on Privacy Impact Assessment (2002) and the Government Security Policy (2002) came into effect. In response, Statistics Canada established broad control mechanismsFootnote 3 for the confidentiality protection of respondent data, including data obtained under DSAs.

The DSA Confidentiality Management Control Framework (DSA Confidentiality MCF) is defined as the way in which Statistics Canada and DSA partners organize themselves to distribute, coordinate and manage confidentiality risks associated with data-sharing processes and to ensure compliance with the relevant acts and policies. The confidentiality management control framework covers three major groups of legislative and policy requirements for DSAs: 1) general DSA information (ISR) and consent rights management; 2) general DSA confidentiality protection management (i.e., physical, IT and personnel security); and 3) DSA-specific confidentiality safeguards (in this case, third-party data-sharing or sharing with other parties).

The DSA Confidentiality MCF for Statistics Canada is characterized by distributed management, with separate mandates and various responsibility arrangements among the following key parties:

  • STC departmental unit: Data Access and Control Services Division (DACS) in the consulting and legal verification role;
  • STC divisions: survey-managing divisions that are responsible for the implementation and operations of associated DSAs (this includes oversight of collection areas and collection partners);
  • DSA partners: federal, provincial/territorial or municipal governments and other Canadian legal entities.

Objectives

The objectives of this audit were to provide the Chief Statistician and the Departmental Audit Committee (DAC) with assurance that:

  1. Statistics Canada's DSA Confidentiality MCF is adequate and effective over the entire life-cycle of DSAs.
  2. Activities supporting the DSA Confidentiality MCF are compliant with the Government of Canada and Statistics Canada acts and policies on confidentiality over the entire life-cycle of DSAs.

Scope and Approach

The audit engagement was conducted in conformity with the Internal Audit Standards for the Government of Canada and the International Professional Practices Framework (IPPF) of the Institute of Internal Auditors. All work was conducted in collaboration with DACS, Statistics Canada's divisions and DSA partner managers responsible for the DSAs selected in the audit sample. The audit approach was inspired by the Government of Canada Management Accountability Framework (MAF) and the Core Management Control Guidelines issued by the Office of the Comptroller General (audit criteria, Appendix A).

The audit universe consisted of 500 active and formalized DSAs, pertaining to s.11, s.12 and s.12+ DSAs for the period of 1976-2008 (see Appendix B). The scope of the audit included:

  • the assessment of Statistics Canada multi-party DSA Confidentiality Management Control Framework established for a system of DSAs, covering the period of October 1976 to October 2008 and all DSAs; and
  • the conduct of tests of compliance controls for the selected active formalized DSAs: a sample of 39 DSAs, of which 31 agreements were s.12 DSAs (for the period of October 2006-08) and 8 s.12+ DSAs (all of them); the sample covered 80 business and social surveys, 10 DSA managing divisions at Statistics Canada and 32 DSA partnersFootnote 4.

Assessing the DSA confidentiality management control framework involved comprehensive examination of multi-party DSA confidentiality compliance practices along such dimensions as DSA confidentiality compliance environment, risk assessment, planning and reporting, information and communication, and monitoring with respect to the three groups of legislative and policy requirements (see Appendix C). To perform the audit work, the following methods were used: audit interviews with Statistics Canada management, audit surveys, examinations of controls and compliance tests.

Excluded from the scope of tests of compliance controls were s.12 DSAs formalized prior to October 2006 due to significant management changes, i.e. introduction of enhanced confidentiality controls and review and audit clauses by Statistics Canada; and all s.11 DSAs due to their relatively lower risk levelFootnote 5.

Authority

The audit was undertaken by Internal Audit Services in accordance with Statistics Canada's Risk-Based Audit Plan for fiscal years 2008/09 to 2010/11, which was approved by the Internal Audit Committee on March 19, 2008.

Findings, Recommendations and Management Responses

An adequate and effective management control framework for DSA confidentiality compliance, in relation to three groups of legislative and policy requirements, would include planning and reporting, information management and communication, risk assessment, monitoring and compliance environment.

In relation to objective 1, out of the five MCF dimensions, only the DSA confidentiality compliance environment is fully managed (see Figure 1). The rest of the MCF elements are at various stages of development, with particular weaknesses in risk assessment, information management and communication, and monitoring, requiring a range of improvements.

DSA confidentiality compliance environment as an element of the MCF is satisfactory. However, the audit determined that there is no single comprehensive document or policy on the management of DSAs that would combine all relevant confidentiality compliance requirements, covering the entire DSA life-cycle and establishing an appropriate MCF.

Systematic risk management is an underpinning of good governance. The DSA management control framework has fragmented and often ad hoc practices for risk management when assessing confidentiality compliance.

Access to information is a key value of Statistics Canada, which includes access by DSA partners to data shared under the data-sharing agreements. The audit found dormant DSAs and instances in which shared data were not provided to partners in a timely manner. Information management and communication practices would benefit from better integration of records and the development of integrated and standardized protocols.

In relation to objective 2, Statistics Canada and DSA partners are compliant with the legislative and policy requirements on confidentiality protection; however, the information required to assess compliance was often fragmented and incomplete at the monitoring stage of the DSA life-cycle.

All recommendations and management response and action plans (MRAPs) should be considered within the existing Statistics Canada management structure.

 

DSA Confidentiality Compliance Environment

The DSA confidentiality compliance environment element of the MCF is satisfactory, and is characterized by distributed management between Statistics Canada and DSA partners. There is no single document or policy on the management of DSAs that would combine all relevant confidentiality compliance requirements and establish a strong management control framework covering the entire DSA life-cycle for all parties involved.

Adequate and effective management of DSAs would include confidentiality compliance controls, governance structures, accountability and responsibility mechanisms, training and operational processes over their entire life-cycle. The audit identified that the DSA confidentiality compliance environment is defined, communicated and managed between DSA parties, especially at the design and negotiation stage of the agreements. However, the environment is complex and difficult to navigate, and is characterized by the absence of an integrated policy framework to manage DSAs.

DSA confidentiality compliance requirements are defined by means of preventative controls found in the Statistics Act, the Privacy Act, the STC Policy on Privacy Impact Assessments (PIA), the STC Policy on Informing Survey Respondents (ISR), the STC Policy on Security of Sensitive Statistical Information, the STC IT Security Policy, the STC Policy on Micro-Data Release, the STC Policy on Discretionary Disclosure and associated guidelines. Provisions for DSAs in legislative and policy references are quite fragmented and difficult to piece together for those who do not deal with them on a daily basis. Specific standards for confidentiality compliance are defined by the texts of DSAs and associated security appendices. This is further complicated by the fact that DSAs often combine multiple requirements from various jurisdictions, which are subject to change, often making standardization difficult.

A strategic DSA governance and oversight framework is established and coordinated among DSA parties. Statistics Canada has a Confidentiality and Legislation CommitteeFootnote 6 that hears DSA concerns and proposals from divisions, DACS and DSA partners, resulting in decisions and action plans. Other internal management committees, such as the Policy Committee, also become involved when necessary. DSA partners can be significantly involved during the implementation stages of DSAs via the Steering and Advisory Committees or Technical Groups associated with surveys covered by DSAs. This process allows all parties to exchange information about DSA confidentiality compliance issues.

The operational authorities and responsibilities for DSA confidentiality compliance are distributed among multiple parties. These responsibilities are limited by the mandates of these parties and are not necessarily carried through the entire life-cycle of a DSA. The departmental function of Data Access and Control Services is dedicated to the legal and policy review, negotiation and approval of DSAs, their modification and termination, as well as the coordination of associated requests between these stages from either subject-matter divisions or DSA partners. Aside from providing services to both divisions and DSA partners, DACS has a limited mandate to ensure accountability for DSA confidentiality compliance during the implementation and monitoring stages. Survey- and DSA-managing divisions consult with DACS during the planning stage and implement DSAs using a set of internal support partners in the fields of collection services, communications and IT, or through joint collection with external DSA partners. Joint management of DSAs between two or more Statistics Canada divisions is also practised, depending on the complexity of the DSAs, but there are no clear guidelines on the associated roles and responsibilities. DSA partners are responsible for ensuring compliance with the terms of the DSAs during the implementation stages of the agreements.

It was found that DSA confidentiality compliance control processes are incorporated into much larger control mechanisms of survey operations. Compliance of DSAs with general information and consent rights requirements is embedded in the Survey Prescription processes. General confidentiality protection requirements for DSAs are incorporated into generic requirements on physical, IT and personnel security, for which security checklists and procedures exist. DSA-specific safeguards, such as the prohibition of third-party data-sharing or the allowance of restricted access for researchers and research organizations of DSA partners under certain conditions, are specified in the agreements. Thus, managers must combine all of these fragments into their own rules and processes, resulting in rather diverse practices for DSA confidentiality compliance management at the divisional level. Further, 90% of divisions do not have dedicated DSA managers; responsibility for operational DSA management instead falls to survey managers. 80% of divisions do not have written managers' guides. There are no integrated operational protocols or manuals dedicated to DSAs across all five dimensions of the MCF.

To mitigate the complexity of this environment and to ensure compliance, DACS provides extensive training programs in all generic areas related to confidentiality. This training covers the basics of legislative and policy compliance requirements, with some references to DSAs. However, it is not specific and operational enough for managers to be fully confident in its application. This results in DACS being overwhelmed by the constant requests for advice, clarifications, reviews, formal letters, custom training sessions, etc.

The absence of an integrated DSA management policy or framework adds to the complexity of managing DSA confidentiality compliance and increases the risk of misinterpretation and confusion, potentially resulting in breaches of confidentiality. It may also result in heterogeneous and erroneous application of DSA confidentiality compliance requirements. A single integrated policy would provide greater overall clarity for the DSA confidentiality compliance environment.

Recommendation #1

It is recommended that Assistant Chief Statistician Corporate Services ensure that Data Access and Control Services develop a comprehensive and integrated Policy on the Management of Data-Sharing Agreements to provide adequate control coverage over the entire DSA life-cycle.

Management Response

Management accepts the recommendation.

DACS will develop a DSA governance process guided by a policy or directive, which will be integrated with the risk management framework. In addition, divisions will be asked to report on this element as part of the new Statistics Canada Quadrennial Program Review (QPR) guidelines.

Deliverables and timelines:

  • Presentation of the governance process to the Confidentiality and Legislation Committee. Integration of this element into the Quadrennial Program Review (QPR) guidelines.
  • Director, Data Access and Control Services and Director, Corporate Planning and Evaluation Division – October 2010

DSA Risk Assessment

DSA risk management practices are at an early stage of development; risk assessment is fragmented, ad hoc and managed informally.

An appropriate risk management model would include adequate and effective practices for assessing the risks of non-compliance and/or non-reporting of breaches and weaknesses related to DSA confidentiality. The audit revealed that systematic and formal mechanisms for DSA risk assessment are not in place; existing practices are fragmented, ad hoc and managed informally.

At the departmental level (DACS), it was found that the risks of non-compliance for general DSA information and consent rights management are assessed at the DSA design and negotiation stage by generic or specific Privacy Impact Assessments. The risks of non-compliance for the other two groups of requirements (i.e., general DSA confidentiality protection and DSA-specific confidentiality safeguards) are not formally assessed by DACS, owing to the newness of the risk management initiative at Statistics Canada. At the divisional level, due to limitations in the mandate of DSA-managing divisions, half of the directors assessed their DSA risk assessment processes as managed informally and unsystematically, while the other half admitted that there is no activity in this area. However, all confirmed that the procedures for reporting breaches and weaknesses of the confidentiality controls are known and implemented when breaches are reported. It is expected that employees making mistakes will come forward or will be detected. At the DSA-partner level, the evidence is insufficient to support a conclusion.

There is a risk that, in the absence of formal, integrated and continuous risk management practices for DSA confidentiality and non-reporting risks across all DSA parties, confidentiality breaches will go undetected and adequate and effective risk mitigation strategies will not be put in place.

An innovative practice exists in the Health Statistics Division, which mitigates the risk that respondents who have not agreed to share their data are mistakenly identified as sharers during the collection period, by monitoring the collection processes for various surveys. The division also maintains a spreadsheet file to identify risks related to DSAs.

Recommendation #2

It is recommended that Assistant Chief Statistician Corporate Services ensure that Data Access and Control Services, in consultation with Corporate Planning and Evaluation, strengthen the risk management practices with respect to DSAs.

Management Response

Management accepts the recommendation.
DACS will conduct a threat and risk assessment with respect to the management of DSAs. This will result in a report outlining the potential threats, and the steps that could be taken to eliminate or reduce the risks.
The process will be repeated regularly.
The report will be presented to the Confidentiality and Legislation Committee.

Deliverables and timelines:

  • Presentation of the report to the Confidentiality and Legislation Committee.
  • Director, Data Access and Control Services and Director, Corporate Planning and Evaluation Division – October 2010

DSA Information Management and Communication

DSA information management and communication practices as an element of MCF are defined, but would benefit from further integration and modernization.

An appropriate information management and communication model would include adequate and effective systems, processes and protocols to form comprehensive DSA records and to gather relevant statistics on DSA confidentiality compliance performance to facilitate the senior management decision-making process. The audit identified that DSA information management and communication systems and processes are disconnected between DACS and DSA-managing divisions, and are managed according to their respective mandates. This practice results in heterogeneous records and gaps in information and communication coverage during the DSA life-cycle. Knowledge management systems and processes are predominantly informal.

At the departmental level, it was found that information is spread over several media: paper files, DACS Administrative Database, and server files. Combined, these sources provide information for decision making, but require integration. Indicators found in the DSA database are useful, but are applicable mostly during the beginning and end of the DSA life-cycle, when DACS controls legal processes. However, rather limited information is available in the middle of the DSA life-cycle. There are no provisions in the systems and processes for the collection and analysis of information on DSA confidentiality compliance.

At the divisional level, the audit revealed that DSA-managing divisions structure their own operational information and communication processes. It was found that they have not established systematic and integrated information management, communication and record systems and processes, but rather keep what they consider essential records to reflect their transactions with DSA partners, providing incomplete information. Divisional managers over-rely on DACS for information, believing they can always access DSA legal files and documentation on request. 60% of directors see the need for a central electronic DSA database to enable proactive management and monitoring of DSAs.
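A central electronic DSA database of the kind the directors call for could, at its simplest, hold one record per agreement with enough life-cycle information to flag dormant DSAs. The sketch below is purely illustrative: the class, field names and two-year dormancy threshold are hypothetical assumptions, not drawn from any actual Statistics Canada system.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class DSARecord:
    """One hypothetical record in a central DSA registry."""
    dsa_id: str
    section: str                              # "s.11", "s.12" or "s.12+"
    partner: str
    surveys: list = field(default_factory=list)
    signed_on: Optional[date] = None
    last_data_transfer: Optional[date] = None

    def is_dormant(self, as_of: date, threshold_days: int = 730) -> bool:
        """Flag agreements with no data transfer in roughly two years."""
        if self.last_data_transfer is None:
            return True
        return (as_of - self.last_data_transfer).days > threshold_days
```

A registry of such records could be queried for dormant agreements during the monitoring stage, supporting exactly the kind of proactive termination or modification decisions the audit finds lacking.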

The audit identified a communication issue between DSA parties regarding the timing of the release of shared data. DSA-managing divisions assume that DSA partners will request the shared data file upon announcement of the release in Statistics Canada's external flagship publication, The Daily. However, this was not the case. Additionally, the majority of DSA partners had difficulty providing information in a timely manner during the audit procedures.

There is a risk that a sub-optimal DSA information management and communication model will not provide a timely, accurate, continuous and holistic "big picture" of DSAs to aid an effective decision-making process.

Recommendation #3

It is recommended that Assistant Chief Statistician Corporate Services ensure that Data Access and Control Services strengthen the information management and communication practices for DSAs.

Management Response

Management accepts the recommendation.
DACS will develop a departmental directive to formally describe the required practices.
The draft directive will be presented to the Confidentiality and Legislation Committee.

Deliverable and timeline:

  • Presentation of the draft directive to the Confidentiality and Legislation Committee.
  • Director, Data Access and Control Services – October 2010

DSA Confidentiality Compliance Status

Statistics Canada and DSA partners are compliant with the legislative and policy requirements on confidentiality protection, but implementing a strong management monitoring regime would facilitate the management of all DSAs.

We expected to find compliance with the Government and Statistics Canada policy framework, the legislative requirements and the terms and conditions of the DSAs over their entire life-cycle. The audit identified that Statistics Canada and DSA partners are compliant with the legislative and policy requirements, but the information, which would normally be acquired through a monitoring program, is insufficient.

In 2006, DACS began gradually introducing a review and audit clause into the texts of DSAs, where partners agreed to it. However, the capacity to exercise the clause on a systematic basis does not exist. The onus is on Statistics Canada's DSA-managing divisions and DACS to verify that partners follow the rules. DSA confidentiality compliance monitoring, defined as practices encompassing evaluations of DSA confidentiality compliance controls, reporting and exchanging information on control weaknesses and their correction, as well as change management, is not envisaged by the current legislative and policy suite. A few years ago, DACS asked the s.11 DSA partners to conduct a self-assessment and submit their reports.

With respect to compliance with general DSA information and consent rights requirements, the audit identified that at the departmental level, the ISR and PIA processes for the verification of survey materials and DSA texts are well managed. At the divisional level, the audit indicated that improvement is required regarding monitoring of the DSA ISR and consent rights requirements during the survey implementation and collection of compliance performance information from the collection areas. A mechanism to collect respondent objections/waivers to data-sharing is in place for all of the divisions in the audit sample. However, only half of them stated that they have a documented procedure for processing these data, collecting reports from collection areas and using this information further in the preparation of shared files.

The audit found that requirements on general DSA physical, personnel, IT security and disclosure management are adequately specified in the texts of the agreements, and compliance by Statistics Canada staff and DSA partners is observed. The audit revealed that 34% of DSA partners reported that they have not yet established any processes, either because they have not requested the data or because they are not planning to request the data.

With respect to compliance with DSA confidentiality safeguards regarding conditions for data-sharing with other parties, the audit found that the prohibition of third-party data-sharing is specified in the texts of DSAs or clarified during the negotiation process. In exceptional circumstances, third-party data-sharing is allowed on the basis of a legislative and security review by DACS. Under the "Uses of Information" clause of the agreements, research contractors and research organizations working directly for the DSA partners can be allowed access to shared data under very strict security and non-disclosure conditions. The audit identified that research contractors and research organizations are often engaged by the DSA partners once the shared data become available, well into the implementation or monitoring stages of the DSAs. Regardless of the voluntary or mandatory nature of the DSAs, divisions have rather limited processes for the coordination and monitoring of data-sharing with other parties. Managers know the provisions of the DSAs; however, systematic documentation of how these processes are managed was not available in all instances during the audit.

Monitoring practices are not systematic, have limited coverage and do not provide complete DSA information as it relates to confidentiality. Insufficient monitoring may result in a failure to detect weaknesses and confidentiality breaches in relation to the DSA legislative and policy requirements; consequently, timely and effective corrective measures cannot be implemented. The lack of a strong monitoring regime also results in an accumulation of dormant DSAs, which would ideally be terminated or modified.

An innovative practice has been identified in the Business Special Surveys and Technology Statistics Division (BSSTSD). A system is in place to track and monitor waivers for data-sharing. Written waivers are noted in the data-capture system (PC-BOSS), and the letters are stored in locked cabinets for reference. Information on the waivers is monitored through this system, and queries are used to eliminate respondents who objected to data-sharing with specific partners. In addition, interviewers are physically monitored during the collection process to ensure compliance with policy requirements on data-sharing.
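The waiver-screening query described above can be illustrated with a short sketch. This is an assumption-laden illustration rather than BSSTSD's actual implementation: the function name, the respondent_id key and the per-partner objection mapping are all hypothetical, and the PC-BOSS system itself is not modelled.

```python
def screen_shared_file(records, waivers, partner):
    """Drop respondents who objected to sharing with a given partner.

    records -- list of dicts, each with a 'respondent_id' key
    waivers -- dict mapping respondent_id to the set of partners
               the respondent objected to sharing with
    partner -- the DSA partner the shared file is being prepared for
    """
    return [
        r for r in records
        if partner not in waivers.get(r["respondent_id"], set())
    ]
```

The essential design point, mirroring the practice the audit highlights, is that the waiver registry is consulted systematically at file-preparation time rather than left to each survey manager's memory.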

Recommendation #4

It is recommended that the Assistant Chief Statistician Corporate Services ensure that Data Access and Control Services implement a DSA monitoring program.

Management Response

Management accepts the recommendation.
DACS will develop a proposal for a DSA monitoring program, including resource requirements. In addition to employee time to manage the monitoring, travel costs will be required.
The proposed monitoring program will be presented to the Confidentiality and Legislation Committee.

Deliverable and timeline:

  • Presentation of the monitoring plan to the Confidentiality and Legislation Committee.
  • Director, Data Access and Control Services – October 2010

Appendices

Appendix A: Audit Criteria

1.1. DSA multi-party confidentiality compliance environment is adequate and effective and satisfies MAF Citizen-Focused Service CFS-1, CFS-3, CFS-4 and Stewardship ST-22 criteria

1.2. DSA multi-party confidentiality compliance risk assessment practices are adequate and effective and satisfy MAF Citizen-Focused Service CFS-1, CFS-3, CFS-4 and Stewardship ST-22 criteria

1.3. DSA multi-party confidentiality compliance control planning & reporting practices are adequate and effective and satisfy MAF Citizen-Focused Service CFS-1, CFS-3, CFS-4 and Stewardship ST-22 criteria

1.4. DSA multi-party confidentiality compliance information & communications practices are adequate and effective and satisfy MAF Citizen-Focused Service CFS-1, CFS-3, CFS-4 and Stewardship ST-22 criteria

1.5. DSA multi-party confidentiality compliance monitoring practices are adequate and effective and satisfy MAF Citizen-Focused Service CFS-1, CFS-3, CFS-4 and Stewardship ST-22 criteria

2.1. General DSA information and consent rights management by all DSA parties over its life-cycle is compliant with relevant acts and policies and associated guidelines, MAF Public Service Values PSV 1-4 and Stewardship ST-22 criteria

2.2. General confidentiality protection management by all DSA parties over its life-cycle is compliant with Statistics Act, GoC Security Policy, StatCan's Policy on Security of Sensitive Statistical Information, IT Security Policy and associated guidelines, MAF Public Service Values PSV 1-4 and Stewardship ST-22

2.3. Management of DSA-specific confidentiality safeguards by all DSA parties over its life-cycle is conducted according to the terms of the agreement and MAF Public Service Values PSV 1-4 and Stewardship ST-22

Appendix B: Cumulative Number of Active Formalized Statistics Canada Data-Sharing Agreements at the end of 2008

Jurisdiction            s.11   s.12   s.12+
Federal                    0    113       6
Newfoundland               2     23       0
Prince Edward Island       0     25       0
Nova Scotia                1     23       0
New Brunswick              1     23       0
Quebec                     5     30       0
Ontario                    1     27       0
Manitoba                   7     29       0
Saskatchewan               1     23       2
Alberta                    1     30       2
British Columbia           1     28       0
Yukon                      1     13       0
Northwest Territories      0     13       0
Nunavut                    0      4       0
Miscellaneous              0     65       0
Total                     21    469      10

Appendix C: Glossary

DSA Confidentiality Management Control Framework (MCF) - A way in which Statistics Canada and DSA partners organize themselves in order to distribute, coordinate, and manage confidentiality risks associated with the data-sharing processes and to ensure compliance with the relevant acts and policies.

DSA Confidentiality MCF Architecture includes:

1st tier: Confidentiality compliance requirement groups

  • general DSA information and consent rights management
  • general DSA confidentiality protection management
  • DSA-specific confidentiality safeguards

2nd tier: Compliance framework

  • DSA multi-party confidentiality compliance environment;
  • DSA multi-party confidentiality compliance risk assessment practices;
  • DSA multi-party confidentiality compliance control planning & reporting practices;
  • DSA multi-party confidentiality compliance information & communications practices;
  • DSA multi-party confidentiality compliance monitoring, change management and corrective practices.

3rd tier: Corporate Control Framework groups

  • Acts, policies, guidelines, standards
  • Accountability & responsibility centres
  • Systems & processes

4th tier: DSA life-cycle stages

  • Design & negotiation
  • Implementation
  • Monitoring
  • Modification/termination

DSA Life-Cycle - The period from the start to the end of activities for a DSA project, divided into four stages:

  1. DSA design & negotiation – the period from the start of communications between the DSA partners on the potential DSA project to the signing of the agreement.
  2. DSA implementation – conduct of the data collection and data transmission activities.
  3. DSA monitoring – monitoring of the DSA partners' compliance with the terms of the agreement, of changes in conditions, etc. It also includes communications with the parties involved in the Management Control Framework, as well as evaluations, inspections, assessments, and reviews of the various controls.
  4. DSA modification/termination – the period from the start of communications between DSA partners on changes to the status of the agreement until it is revised or terminated.

Management Control - Any action taken by management and other parties to manage risk and increase the likelihood that established objectives and goals will be achieved. Management plans, organizes, and directs the performance of sufficient actions to provide reasonable assurance that objectives and goals will be achieved.

ISR – Informing Survey Respondents, Statistics Canada policy and associated processes and mechanisms.

PIA – Privacy Impact Assessment, Statistics Canada policy and associated processes and mechanisms.

Notes:

Footnote 1

"Data-sharing between Statistics Canada and other organizations: A primer", http://www44.statcan.ca/2008/11/s0400-eng.htm.

Footnote 2

s.12+ refers to the amendment of s.12 of the Statistics Act, which authorizes mandatory data sharing with other federal or provincial government departments and organizations that have the legal authority to compel response in addition to Statistics Canada's, and that use the data in accordance with their own governing legislation.

Footnote 3

The Policy on Informing Survey Respondents and associated guidelines, the Privacy Impact Assessment Policy and associated guidelines, the Policy on Security of Sensitive Statistical Information and associated guidelines, the IT Security Policy and associated guidelines, etc.

Footnote 4

The sample adequately represents a range of federal government departments, provincial non-statistical government agencies and other organizations, and miscellaneous legal entities in Canada (including aboriginal organizations).

Footnote 5

In addition, enhanced confidentiality protection compliance inspections of the provincial/territorial statistical focal points were conducted consistently by Statistics Canada during 2001-2008 for the purposes of the CRA MOU.

Footnote 6

The Confidentiality and Legislation Committee reports to the Policy Committee.

Archived – Audit of Research Data Centres Program

Audit Report
Audit of Research Data Centres Program

Statistics Canada
September 30, 2010
Project Number: 80590-61

Executive summary

The Research Data Centres (RDC) are part of an initiative by Statistics Canada, the Social Sciences and Humanities Research Council (SSHRC) and university consortia to help strengthen Canada's social research capacity and to support the policy research community. RDCs provide researchers with access, in a secure setting, to microdata from population and household surveys. The Centres are governed by a board of directors composed of representatives from Statistics Canada, SSHRC, Canadian Institute for Health Research and the academic directors of each centre. The centres are staffed by Statistics Canada employees.

The objectives of the audit of RDCs are to provide the Chief Statistician and the Departmental Audit Committee with assurance that activities supporting the governance framework are adequate and effective to ensure that services are delivered to researchers, and that access to data follows Statistics Canada policies and procedures. The audit was conducted by Internal Audit Services in accordance with the Government of Canada's Policy on Internal Audit.

Key Findings

Activities supporting the RDC network governance are adequate and effective in ensuring continued delivery of services. The corporate management structure is effective, results are measured and reported on; however, performance is not measured against program operational targets and results are not included in the Departmental Performance Report. Once research proposals are approved by Statistics Canada, the scope of the initiative is managed by stakeholders. Since inception, there have been challenges in maintaining program costs within granted funding limits as the number of centres rose from 9 to 24. As a result, increasing deficits have been incurred within the RDC Program. Through a newly increased annual budget, Statistics Canada has stabilized the costs for this program and established a costing structure to ensure RDCs maintain operational costs within the scope of their budgets.

The RDC program and RDC sites do not have formal risk management practices, and risks associated with the continuity of the operations of RDCs have not been assessed as part of overall risk management practices for the program.

Researchers conduct their work in restricted physical and technological infrastructures that are secure, which reduces the risk of confidentiality breaches. Information handled in RDCs is subject to governmental confidentiality and security requirements; however, RDCs have not been inspected periodically for compliance with security requirements following initial inspections. After conducting a number of security checks on site, some deviations from security requirements were observed. Security training is provided to new analysts, but there was no evidence of continued awareness programs. Finally, the disclosure process in place for the program is applied rigorously and consistently, and no unwanted disclosures were observed.

Conclusion

Overall, the activities supporting the Governance Framework surrounding the Research Data Centres Program are adequate and effective, ensuring that services are delivered to researchers. Nevertheless, we noted opportunities to improve the management control framework as it relates to reporting results and assessing risks, as well as opportunities to ensure that departmental security requirements are met.

Introduction

Background

In the late 1990s Statistics Canada and the Social Sciences and Humanities Research Council (SSHRC) commissioned a task force to examine the state of the quantitative social sciences in Canada. Concern was expressed about the future of Canada's capacity to fruitfully exploit the rich sources of quantitative data on households and individuals to inform public policy and public debate. The joint task force published its findings in 1998 in a report titled "The Final Report of the Joint Working group of the Social Sciences and Humanities Research Council and Statistics Canada on the Advancement of Research using Social Statistics" (Statistics Canada 1998). The report outlines a series of recommendations aimed at building social science research capacity in Canada, improving access to Statistics Canada data in order to support research activity and communicating the research findings.

These recommendations led to the establishment of a Canadian Research Data Centres Network (CRDCN) and the RDC Program within Statistics Canada as the organisational unit that represents the agency within the Network. The first 9 Research Data Centres started offering access to data in secure university-based laboratories across Canada in 2001. A second Canada Foundation for Innovation (CFI) award (2006) and the joint SSHRC/CIHR 2005-2010 operating grant have supported the Network's rapid expansion and allowed it to make significant progress. Data access was improved by giving researchers across the country access within universities to detailed microdata, initially to Statistics Canada longitudinal surveys, and now to a broader range of data sets.

RDCs provide researchers with access, in a secure setting, to microdata from population and household surveys. Statistics Canada employees oversee operations carried out in the centres. The centres operate under the provisions of the Statistics Act in accordance with all the confidentiality rules and are accessible only by researchers with approved projects who have been sworn in under the Statistics Act as 'deemed employees.' RDCs are located throughout the country, so researchers do not need to travel to Ottawa to access Statistics Canada microdata.

The research data centres provide opportunities to: generate a wide perspective on Canada's social landscape; provide social science research facilities across the country in both larger and smaller population centres; expand the collaboration between Statistics Canada, SSHRC, CIHR, CFI, universities and academic researchers; build on the Data Liberation Initiative and train a new generation of Canadian quantitative social scientists.

The network has grown rapidly since its creation. The number of sites has risen from 9 sites in 2001 to 24 sites in 2010, and the number of projects initiated and completed has risen dramatically. Over 2,600 researchers from a multitude of disciplines and institutions have worked in a growing number of RDCs on over 1,500 projects, using an expanding range of microdata files to examine many health and socioeconomic issues. Statistics Canada is responsible for the protection of the data, confidentiality vetting and researcher support in centres. There are 53 Statistics Canada analysts and statistical assistants in the Centres, and 5 regional managers dedicated to this program across the country. The program is supported by a Head Office Operations Unit with 3 employees.

Authority

The audit was part of the Multi-Year Risk-Based Audit Plan 2008/09-2010/11 and was approved on March 19, 2008 by the Departmental Audit Committee.

Audit Objectives

The objectives of the audit of RDC are to provide the Chief Statistician and the Departmental Audit Committee with assurance that:

  • activities supporting the Governance Framework are adequate and effective to ensure that services are delivered to researchers;
  • access to data follows Statistics Canada policies and procedures.

Scope and Approach

The scope of the audit was to assess the effectiveness and adequacy of the activities supporting the current governance structure, as well as compliance with key Statistics Canada policies and procedures relating to security and confidentiality. Relevant policies are: the Security Practices Manual, which relates to the departmental security policy; the IT Security Policy; the Record Linkage Policy; the Security of Sensitive Statistical Information Policy; the Discretionary Disclosure Policy; and the Statistics Act. The audit was conducted in conformity with Treasury Board and Institute of Internal Auditors standards.

The approach consisted of assessing the processes and procedures of the governance framework in place to control access to data. This was achieved through interviews with key departmental staff and external stakeholders managing the university side of the partnership program, detailed testing of processes and procedures and review of relevant documentation.

The RDC program has 24 centres located across Canada. The examination phase included physical inspections of five RDC sites: the Carleton, Ottawa, Outaouais, Local (COOL) Centre at the University of Ottawa, the Quebec Inter-University Centre for Social Statistics (QICSS) at the University of Montreal, the University of Toronto RDC, the University of British Columbia RDC and the Simon Fraser University RDC in British Columbia.

The examination phase for this audit was conducted from February to May, 2010.

Findings, Recommendations and Management Responses

Governance and Strategic Direction

Overall, activities supporting the RDC network governance are adequate and effective in ensuring continued delivery of services. The corporate management structure is effective, and results are measured and reported on. Nevertheless, performance is not measured against operational program targets and results are not included in the Departmental Performance Report. Once research proposals are approved by Statistics Canada, the scope of the initiative is managed by stakeholders. Since inception there have been challenges in maintaining program costs within granted funding limits as the number of centres rose from 9 to 24. As a result, increasing deficits have been incurred within the RDC Program. Through a newly increased annual budget, Statistics Canada has stabilized the costs for this Program and established a costing structure to ensure RDCs maintain operational costs within the scope of their budgets.

An established governance mechanism providing adequate strategic direction would include a clear committee structure to ensure the effectiveness of relationships and escalation of management issues. The audit found that a clear committee structure is established and is effective. The Canadian Research Data Centres Network (CRDCN) is a partnership consisting of participating universities, Statistics Canada (through the RDC Program funding) and two major granting councils: the Social Sciences and Humanities Research Council (SSHRC) and the Canadian Institutes for Health Research (CIHR).

The CRDCN and Statistics Canada have distinct reporting structures and both of them work in a joint partnership (see Appendix B). The CRDCN committee is the main governing body for the Network. It negotiates grants with the major councils and other funding bodies and sets the policies that determine the membership of the Network, the distribution of grant allocations, the strategic directions for the Network while respecting Statistics Canada criteria for confidentiality, and the dissemination of results of research conducted in the centres and the promotion of the Network nationally and internationally. The CRDCN meets twice a year and is composed of representatives of Statistics Canada, Universities, and partners (SSHRC, CIHR). During these meetings, items such as the funding allocation formula, new IT technology initiatives, training for students, RDC conferences, and member voting rights are discussed. The allocation formula is used by the CRDCN to determine grant allocation to RDCs, based on RDC activity workload, measured by the number of contracted research projects underway or suspended, and status on outputs.

In order to increase the efficiency of the decision-making process at meetings, the CRDCN went through a re-organisation and implemented a sub-committee structure. The following sub-committees have been created: the Executive Committee, Implementation of CFI Award, Coordinate Information Gathering and Dissemination, and Thresholds and Measures for Allocation. The CRDCN Executive Committee was created to address strategic decisions involving the university network. This group meets twice a year prior to the CRDCN Committee's bi-annual meetings. Representatives of Statistics Canada attend these meetings. The CRDCN sub-committees were created in a supporting role to the CRDCN Committee. These sub-committees provide subject-matter expertise on research-related questions for consideration and decision making at the CRDCN Committee meetings. Statistics Canada representatives often attend these meetings and provide workload information used to set the funding allocation formula.

The RDC Program structure within Statistics Canada consists of the lead program manager and five regional RDC managers. Meetings are held on a weekly basis to discuss day-to-day operations of the RDC Program and ensure that processes and procedures are applied consistently across the program. Members also meet as a working group to discuss and resolve operational issues that may arise. The operations of the centres are managed by Statistics Canada employees in partnership with academic directors from the universities. Universities provide IT support. One representative of the research community in each university holds the function of Academic Director. The Statistics Canada analysts report to the regional RDC managers, who are located in Statistics Canada regional offices. Regional RDC managers are responsible for the management of several RDCs and report to the Assistant Director, Microdata Access Division. Because managers are not on site, management of the centres is done through e-mails, phone conversations and twice-yearly visits to centres.

To access the microdata housed in the Research Data Centres (RDCs), researchers submit a project proposal to the Social Sciences and Humanities Research Council (SSHRC) for peer review and internal review by Statistics Canada. SSHRC invites applications from individual researchers or from research teams led by a principal applicant. The principal applicant is responsible for submitting application forms on behalf of the team. Each proposal is evaluated by two academic peers and a Statistics Canada analyst. The SSHRC facilitates the peer review process and the head office operations unit facilitates the internal Statistics Canada review. The proposals are assessed based on: scientific merit and viability of the proposed research; relevance of the methods to be applied; demonstrated need for access to detailed microdata; and expertise and ability of the researchers to carry out the proposed research as illustrated in the resumes and list of contributions. Statistics Canada is not involved in the prioritization of projects undertaken in RDCs; however, it is the responsibility of each RDC to ensure that resources and funding are available prior to taking on a new research project.

Management Control Framework

An adequate management control framework is required to ensure effective planning, organizing, controlling, directing, communicating, as well as compliance with the Treasury Board Management Accountability Framework and the TBS Risk Management Policy. Clear program objectives should support strategic direction, operational plans and priorities, and should provide clear direction on how resources should be allocated to achieve these plans. Planning for the program involves the production of annual operational plans, budgets, staffing plans, and an annual RDC activity report, which are produced on a timely basis.

To ensure appropriate delegation of authorities, roles and responsibilities need to be documented and communicated. It was found that roles and responsibilities were clearly documented and understood. The organisational chart of the overall structure, including the relationships between the CRDCN and Statistics Canada, is up to date and indicates the linkages between the different stakeholders.

A communication process should be in place to ensure consistency of program activities and compliance to policies and regulations. The CRDCN, Statistics Canada and each RDC have their own websites, and communication for the program is done through the use of web technology. Communication between the RDC Regional Managers and the Statistics Canada analysts is informal. Items discussed are mainly related to RDC management and statistical operational processes. Operational issues not resolved at the regional level are escalated to the RDC Program Manager.

Values and ethics are promoted and communicated on an on-going basis within the researcher community through formal documentation, training, and policy frameworks. Microdata research contracts signed by researchers include several clauses regarding values and ethics. Also, researchers receive an orientation session where values and ethics are introduced and presented. Researchers acknowledge the "Researcher Guide", which includes a section on values and ethics. Values and ethics awareness is also reinforced through the Oath required under the Statistics Act.

An effective performance management system should be in place to measure and report on performance. Relevant performance targets should be identified, and information on results should be gathered and used to make departmental decisions. Program outcomes should also be reported as part of the Departmental Performance Report (DPR). The RDC Program Manager, who reports to the Director General of Census Subject Matter, Social and Demographic Statistics, is accountable for reporting on performance. Several reviews of the RDC initiative have been conducted: the mid-term review produced by SSHRC and CIHR, the Quadrennial Program Review of the program produced by Statistics Canada for 2000-2008, the Program Manager Report, and the Satisfaction Survey. Although results are measured and reported on for internal administrative purposes, performance is not measured against comprehensive program operational objectives, nor is it reported in Statistics Canada's DPR.

Investment Management/Funding

The overall initiative is funded primarily by Statistics Canada (through the Research Data Centre Program), the SSHRC, CIHR and the universities. SSHRC and CIHR provide grants to the CRDCN, which then re-distributes funds to the RDCs. An allocation formula is used as a basis to distribute funds, taking into consideration factors determining RDC activity levels such as the number of researchers, branches, and projects. Statistics Canada provides in-kind contributions, by not only providing access to data, but also by assuming the operational costs to manage the program. In the event that costs associated with administrating additional research projects exceed the limits set by the program, RDCs enter into an agreement with Statistics Canada for additional resources on a cost-recovery basis.
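
The report does not reproduce the allocation formula itself. As a purely illustrative sketch, a proportional weighting scheme consistent with the factors named above might look like the following (the weights, centre names, and activity measures are all assumptions, not the CRDCN's actual formula):

```python
# Hypothetical proportional allocation sketch -- the weights and activity
# measures below are assumptions, not the CRDCN's actual formula.

def allocate(total_grant, centres, weights):
    """Split total_grant across centres in proportion to a weighted
    activity score built from factors such as the number of researchers,
    branches, and active projects."""
    scores = {
        name: sum(weights[k] * activity[k] for k in weights)
        for name, activity in centres.items()
    }
    total_score = sum(scores.values())
    return {name: total_grant * score / total_score
            for name, score in scores.items()}

# Example with two hypothetical centres.
centres = {
    "RDC A": {"researchers": 120, "branches": 2, "projects": 60},
    "RDC B": {"researchers": 40, "branches": 1, "projects": 25},
}
weights = {"researchers": 1.0, "branches": 5.0, "projects": 2.0}
shares = allocate(500_000, centres, weights)
```

Whatever the real weights, the key property of such a scheme is that the shares always sum to the total grant, so more active centres draw proportionally more of the fixed pool.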

Every RDC has its own budget, which is managed by the Academic Director. The Academic Director is a university researcher and does not have authority over the Statistics Canada analysts. A large portion of the RDC budget consists of the cost associated with Statistics Canada analysts and for daily operations. Universities provide IT services and support to RDCs. Enhancements or development of new initiatives in centres are undertaken only if additional sources of funding are received.

The RDC program budget is intended to fund Statistics Canada's head office support for the RDC Network, including: the maintenance of the Management Information System on research activity conducted by deemed employees for Statistics Canada, the administration of contracts, maintenance of the RDC Web site, provision of the data to the RDCs, preparation and maintenance of the documentation required by researchers and RDC project staff, LAN support for head office, and the provision of methodological support to the RDCs.

The demand for Statistics Canada data access has grown over the last ten years, and increasing annual deficits have been incurred within the Program as the number of centres grew from 9 to 24. Over the past three years, deficits incurred by the RDC program were partially covered by funds available in other field programs.

Actual Research Data Centres Program (RDC) expenditures for fiscal years 2007/2008, 2008/2009 and 2009/2010, and budget/target (Footnote 1) for 2010/2011 and 2011/2012.

RDC Program Expenditures                2007/2008   2008/2009   2009/2010   2010/2011   2011/2012
                                         (actual)    (actual)    (actual)    (target)    (target)
Base (PE 1884)
  ITSD (Footnote 2)                       138,689     112,592     182,320           -           -
  Salary                                  364,181     697,726     595,040     711,640     711,640
  Non-Salary                               92,238     102,158     100,761      90,555      90,555
  Prog. Admin Costs                       595,108     912,476     878,121     802,195     802,195
Recoverable Expenses (PE 6587)
  ITSD (Footnote 2)                         2,374      33,053      66,102
  Salary                                1,920,355   1,943,262   2,126,833
  Non-Salary                              458,614     330,798     220,105
Total Recoverable Costs (Footnote 3)    2,381,343   2,307,113   2,413,040
Total Base & Recoverable (Footnote 3)   2,976,451   3,219,589   3,291,161
Minus: costs recovered                 -2,381,343  -2,307,113  -2,413,040
Net Program Costs                         595,108     912,476     878,121
Budget/Target                             527,000     527,000     527,000
Surplus (Deficit)                         -68,108    -385,476    -351,121
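
The internal arithmetic of the expenditure table can be checked directly from the reported components; the sketch below recomputes the base program costs and the Surplus (Deficit) row:

```python
# Cross-check of the RDC expenditure table (amounts as reported, in dollars).
actuals = {
    "2007/2008": {"itsd": 138_689, "salary": 364_181, "non_salary": 92_238},
    "2008/2009": {"itsd": 112_592, "salary": 697_726, "non_salary": 102_158},
    "2009/2010": {"itsd": 182_320, "salary": 595_040, "non_salary": 100_761},
}
BUDGET = 527_000  # annual Budget/Target for each of the three years

# Prog. Admin Costs (= Net Program Costs, once recoverables net out):
net_costs = {year: sum(parts.values()) for year, parts in actuals.items()}
# Surplus (Deficit) = Budget/Target minus net program costs:
surplus = {year: BUDGET - cost for year, cost in net_costs.items()}
# surplus["2009/2010"] == -351121, matching the reported deficit
```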

In April 2009, the RDC program was transferred to the Microdata Access Division. For the fiscal year 2009/10, the deficit incurred for the research data centre initiative reached $351,121. In order to remedy this situation, a long-term proposal has been submitted to ensure on-going funding to cover head office costs. Going forward, the annual funding for the program has been set at $800,000.

RDCs are now expected to maintain operational costs within the scope of their budgets.

Recommendation No. 1

It is recommended that the Assistant Chief Statistician (ACS) of Social, Health, and Labour Statistics ensure that performance indicators are measured against operational targets set for the program and that program results are included in the DPR for future target references.

Management Response

Agreed. Performance indicator measures and results are now an integral part of the Performance Program Review, with detail in the program logic model of outcomes and how they are measured. The Microdata Access Division will thus include them in its upcoming Performance Program Review. Performance of the RDC program has been positively reviewed through an external international panel and through client satisfaction feedback.

Deliverables:

  • Performance indicators and measures identified in the Performance Program Review and Departmental Performance Report - by March 31, 2011; and
  • Program results identified in the Performance Program Review and Report on Plans and Priorities - by March 31, 2011.

Risk Management

The RDC program and RDC sites do not have formal risk management practices, and risks associated with the continuity of the operations of RDCs have not been assessed as part of overall risk management practices for the program.

In a well-controlled program environment, management should have a solid and up-to-date understanding of the internal and external factors that may expose their strategic and operational objectives to risk. Resources and strategic risks should be monitored proactively to assist in decision-making and Statistics Canada's Long Term Planning process. Formal risk management practices enable program managers to identify, assess, monitor and report on risks that may result in threat or opportunity. Although some risks for the RDC program are discussed informally in meetings, there is no evidence of a risk management framework in place or a link to Statistics Canada's Risk Management Model. In the absence of a formal risk management framework in place, management's ability to identify and influence risk throughout the program lifecycle is weakened.

The Policy on Government Security states: "Continuity of government operations and services is maintained in the presence of security incidents, disruptions or emergencies". The departmental Business Continuity Plan policy also states that "it is the responsibility of the RDC Manager to develop and implement a Business Continuity Plan for each of the Research Data Centres and integrates the plan with those developed by the institution or university in which the centre is located". Although the RDC program is not a mission-critical project, risks associated with the continuity of the operations of RDCs should be identified and assessed as part of the overall sound risk management practices for the program.

Recommendation No. 2

The Assistant Chief Statistician (ACS) of Social, Health and Labour Statistics should ensure that the RDC program is included in Statistics Canada's Risk Management Model and that risks associated with the continuity of the operations of RDCs are identified and assessed as part of the overall risk management practices for the program.

Management Response

Agreed. The RDC program has identified program risks for Statistics Canada's Risk Management document in July 2010. Risks associated with RDC operations will be an integral part of the Performance Program Review, including completion of the Risk Register. The program manager will continue work on a business continuity plan for the program.

Deliverables:

  • Inclusion in Statistics Canada's Risk Management document, a Business Continuity Plan, and a risk assessment in the Performance Program Review. - by March 31, 2011.

Access to data

The audit found that researchers conduct their work in restricted physical and technological infrastructures that are secure, which reduces the risk of confidentiality breaches. Information handled in RDCs is subject to governmental confidentiality and security requirements. Nevertheless, RDCs have not been inspected periodically for compliance with security requirements. After conducting a number of security checks on site, some deviations from security requirements were observed. Security training is provided to new analysts, but there was no evidence of continued awareness programs. Finally, the disclosure process in place for the program is applied rigorously and consistently, and no unwanted disclosures were observed.

It is expected that controls over access to Statistics Canada data in RDCs are in compliance with Government and Statistics Canada policies and procedures on security and confidentiality. Applications are submitted through the Social Sciences and Humanities Research Council, and assessed within Statistics Canada through the SBSD, SM Area, DACS, and the RDC Program divisions. Once access has been granted, universities provide RDC space and Statistics Canada ensures the site complies with departmental physical security and confidentiality policies. The audit found that researchers conduct their work in restricted physical and technological infrastructures that are secure, which reduces the risk of confidentiality breaches, and no unwanted disclosures were observed. Information handled in RDCs is subject to governmental confidentiality and security requirements.

Infrastructure

Prior to approving the provision of access to Statistics Canada data in RDCs, the physical and technological infrastructure of RDCs must meet the security requirements of the department. The universities provide physical sites as well as IT support, and Statistics Canada ensures the sites meet the departmental security requirements, which are based on the TBS Policy on Government Security. To review the security of the infrastructure, a security inspection checklist has been developed. Inspections include physical security infrastructure items such as site access, locks, and keys, and electronic security infrastructure items such as server access and passwords. This security inspection checklist is used to ensure departmental security requirements are applied when a new RDC or branch is opened. In summer 2009, the security inspection checklist was reviewed by the Data Access and Control Services Division to ensure that the requirements of the new Policy on Government Security continued to be met. A review of a sample of completed security inspection checklists revealed several forms that were only partially completed; consequently, there was insufficient evidence that complete security inspections were conducted.

The TBS Policy on Government Security, through the Directive on Departmental Security Management, recommends regular security inspections of work sites in order to identify potential security risks. In practice, RDCs have not been re-inspected since their opening, in some cases for ten years. After conducting a number of security checks on site, some deviations from security requirements were observed. As an example, the Statistics Canada Security Practices Manual states that "to protect the integrity and data availability of the Agency's information assets, all information should be on servers and network drives that receive daily automatic backup". The audit found that, in some instances, backup storage procedures were not consistently applied as described in the Manual.

The TBS Policy on Government Security states that "A departmental security awareness program covering all aspects of departmental and government security must be established, managed, delivered and maintained to ensure that individuals are informed and regularly reminded of security issues and concerns and of their security responsibilities." The departmental requirement for security training has been enforced through the KLICK security course offered to RDC analysts; however, there was no evidence of mechanisms in place to ensure continued security awareness among Statistics Canada analysts since the opening of the centres.

The demand for greater access to data continues to increase. In response, Statistics Canada's program management is examining other means of providing improved data access, such as remote access and synthetic files.

Confidentiality

One of Statistics Canada's key values is the preservation of the confidentiality of its data and of its respondents. Data handled by RDCs must also meet departmental confidentiality and security requirements. Data requested by researchers are extracted at Headquarters, encrypted on disk and transmitted to RDCs by courier, following the confidentiality rules for Protected B information. The data are decrypted by the Statistics Canada analyst and installed on the RDC server, which is not connected to a network. Researchers obtain access to Statistics Canada data samples through individual user accounts for each of their research projects. Disclosure guidelines are in place within the department to prevent the publication of protected information. To ensure that Statistics Canada information is protected within RDCs, the Statistics Canada analysts conduct a disclosure analysis before results are published: researchers submit their output to the analysts, who review it and remove any information that could lead to a breach of confidentiality. The audit found that the disclosure process in place for the program is applied rigorously and consistently.

Recommendation No. 3

The Assistant Chief Statistician (ACS) of Social, Health and Labour Statistics should work with Corporate Services to ensure that regular inspections are instituted so that the infrastructure surrounding access to data continues to meet the requirements of the departmental security policy.

Management Response

Agreed. In addition to completing initial site inspections, the RDC program will conduct regular inspections of all RDCs. The Microdata Access Division has in place a number of procedures to safeguard the physical and electronic security of the centres and their data.

Deliverables:

  • An inspector from Corporate Services will report on the security of each RDC by submitting the completed security checklist. Re-inspections of the RDCs will begin in the 2011-2012 fiscal year, and all current RDCs will be re-inspected by 2015-2016.

Appendices

Appendix A: Audit Objectives and their Criteria

Objective 1: The activities supporting the Governance Framework are adequate and effective to ensure that services are delivered to researchers.

Governance and Strategic Direction: Review the adequacy of the management control framework (MCF) of the program including processes and practices related to planning, organizing, controlling, directing and communicating.

  • A relationship exists between the strategic plan and the objectives of the programs.
  • Roles and responsibilities are defined and communicated throughout the program.
  • A clear organizational structure is established and documented, and reporting relationships are effective.
  • Values and ethics are promoted among stakeholders.
  • A process is in place to communicate program activities.
  • An effective performance management and accountability framework is in place to measure and report on performance.

Risk Management:

  • A risk management mechanism should exist to identify, assess, monitor and report on risks.

Investment Management/Funding:

  • The current funding planning/budgeting processes are effective. Expected costs are defined and assessed based on expected benefits.

Objective 2: Access to data follows Government and Statistics Canada policies and procedures.

Infrastructure:

  • Physical and technological infrastructures are secure.

Information Security and Confidentiality:

  • Electronic data access is controlled efficiently.
  • Confidentiality is maintained and data are adequately protected.
  • Unwanted disclosure is avoided.

Business Continuity Planning:

  • A business continuity plan should be defined to ensure continuity of operations.

Appendix B: Overview of the RDC Network

Overview of the Research Data Centres Network

Reference: Quadrennial Program Review: 2000 - 2008, page 19.

How to Make an Access to Information Request or a Personal Information Request

You can submit your Access to Information or Privacy request online or by mail.

On-line Request

The ATIP Online Request service is a faster, easier and more convenient way to submit access to information or privacy requests. Apply online today to save time and postage.

Mailing your Request

Although forms exist for the submission of both access to information and privacy requests, using these forms is not required. You may submit your request in the form of a letter as long as you clearly explain what information you are seeking and under which Act.

There are no fees associated with privacy requests. Access to information requests must be accompanied by a non-refundable $5 application fee, as prescribed by paragraph 7(1)(a) of the regulations under the Act. Payment may be made in cash, or by cheque or money order payable to the Receiver General for Canada.

Before you make a Request

Please note that this is not a service to request statistical information. For statistical information, please contact Statistics Canada's Statistical Information Service by telephone at 1-800-263-1136 or by e-mail at infostats@statcan.gc.ca. The statistical information may already be published or available for purchase by the public and is excluded under the Access to Information Act.

If you are unsure whether your request would be considered an Access to Information Request or a statistical information request, please consult with the Statistics Canada Access to Information and Privacy Coordinator by email at statcan.atip-aiprp.statcan@statcan.gc.ca before you submit your request.

If you want information about the Census and/or the 1940 National Registration, you may go directly to the "Application and Authorization" link.

Access to Information Request Form

Personal Information Request Form

Please send your request to

Pierre Desrochers
Chief Privacy Officer
Office of Privacy Management and Information Coordination
Statistics Canada
R.H. Coats Building, 2nd floor
100 Tunney's Pasture Driveway
Ottawa, Ontario K1A 0T6

If you need assistance, contact us using one of the following:

Email address: statcan.atip-aiprp.statcan@statcan.gc.ca
Phone: 613-894-4086

Principles for assisting applicants

In processing your access request under the Access to Information Act, we will:

  1. Process your request without regard to your identity.
  2. Offer reasonable assistance throughout the request process.
  3. Provide information on the Access to Information Act, including information on the processing of your request and your right to complain to the Information Commissioner of Canada.
  4. Inform you as appropriate and without undue delay when your request needs to be clarified.
  5. Make every reasonable effort to locate and retrieve the requested records under the control of the government institution.
  6. Apply limited and specific exemptions to the requested records.
  7. Provide accurate and complete responses.
  8. Provide timely access to the requested information.
  9. Provide records in the format and official language requested, as appropriate.
  10. Provide an appropriate location within the government institution to examine the requested information.

North American Product Classification System (NAPCS) Canada 2012 Version 1.1

Introduction

NAPCS Canada 2012 Version 1.1 updates NAPCS Canada 2012 Version 1.0. In total, there are 271 changes between the two versions. Some categories were split and others merged; new categories were added and some deleted, for a net addition of 65 product categories across the various levels, providing better alignment with survey data collection and publication. In some instances, these modifications required the renumbering of codes. These changes account for about half of the update; the remainder are edits to class titles that add precision to their formulations. The detailed list of changes can be obtained from Standards Division at standards-normes@statcan.gc.ca.

Standard classification structure

The standard classification structure of NAPCS Canada 2012 comprises four levels: group, class, subclass, and detail. The table below outlines the nomenclature and provides the number of categories within each level of NAPCS Canada 2012 versions 1.1 and 1.0.

Standard classification structure of NAPCS Canada 2012
Level | Coding | Number of categories, Version 1.1 | Number of categories, Version 1.0
Group | 3-digit codes | 158 | 158
Class | 5-digit codes | 511 | 510
Subclass | 6-digit codes | 1,402 | 1,398
Detail | 7-digit codes | 2,694 | 2,648
Table source: Statistics Canada, NAPCS.
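The level lengths above suggest that NAPCS codes nest by prefix: a 7-digit detail code begins with its 6-digit subclass, which begins with its 5-digit class, which begins with its 3-digit group. As a minimal sketch under that assumption (the example code "1611111" is hypothetical, not taken from the classification):

```python
def napcs_hierarchy(detail_code: str) -> dict:
    """Derive the group, class and subclass prefixes of a 7-digit
    NAPCS Canada 2012 detail code, assuming prefix-based nesting."""
    if len(detail_code) != 7 or not detail_code.isdigit():
        raise ValueError("expected a 7-digit NAPCS detail code")
    return {
        "group": detail_code[:3],     # 3-digit group
        "class": detail_code[:5],     # 5-digit class
        "subclass": detail_code[:6],  # 6-digit subclass
        "detail": detail_code,        # 7-digit detail
    }

# Hypothetical example: walk a detail code up through its parent levels.
print(napcs_hierarchy("1611111"))
```

This prefix convention is an assumption drawn from the coding lengths in the table; the authoritative concordance between levels is available from Standards Division.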

Classification variants

Along with NAPCS Canada 2012 Version 1.1, two new regrouping variants are made available: one for the Industrial Product Price Index (IPPI) and one for the Raw Materials Price Index (RMPI). These variants add one level (section) above the standard classification structure; this new level is defined in terms of standard groups (three-digit). The tables below illustrate the nomenclature of the IPPI and RMPI variants and provide the number of categories within each level.

Levels for IPPI variant
Level | Coding | Number of categories
Section | 3-character alphanumeric codes | 21
Group | 3-digit standard codes, and 4-character alphanumeric codes | 79
Class | 5-digit standard codes, and 6-character alphanumeric codes | 241
Subclass | 6-digit standard codes | 665
Detail | 7-digit standard codes | 1,190
Table source: Statistics Canada, NAPCS.
Levels for RMPI variant
Level | Coding | Number of categories
Section | 3-character alphanumeric codes | 6
Group | 3-digit standard codes, and 4-character alphanumeric codes | 21
Class | 5-digit standard codes, and 6-character alphanumeric codes | 44
Subclass | 6-digit standard codes | 90
Detail | 7-digit standard codes | 197
Table source: Statistics Canada, NAPCS.

At the time of publishing this note, the plan is to create two regrouping variants for capital expenditures and one extension variant for agricultural products.