Audit Report
June 2018
Project Number: 80590-102
- Executive summary
- Conformance with professional standards
- Introduction
- Background
- Audit objective
- Scope
- Approach and methodology
- Authority
- Findings, recommendations and management response
- Development and implementation of test strategies and plans for the census processing phase
- Development, approval and implementation of certification strategies
- Appendices
- Appendix A: Audit criteria
- Appendix B: Initialisms
Executive summary
The Statistics Act requires that Statistics Canada collect, compile, analyze and publish statistical information on the economic, social, and general conditions of the country and its people. The Census Program is the largest data collection activity conducted by Statistics Canada to meet its obligations under the Act.
The Census Program provides a statistical portrait of the country and its population every five years. Given its importance, the management of data quality is key to ensuring that the program's products are of sufficient quality for their intended uses. Various evaluations and audits over time have confirmed that the Agency has robust processes and systems in place overall to ensure the production of quality census information.
For the 2016 Census, there were significant changes and upgrades to the program's key systems, sub-systems and applications. These changes included the reinstatement of the long form questionnaire and the introduction of the Integrated Collection and Operation Systems (ICOS) to collect census data. Changes and improvements undergo testing prior to implementation.
In addition to the testing of changes to systems, Statistics Canada employs a certification process to verify the validity of its data to ensure they are reliable, sound and defensible. This process identifies and corrects inconsistencies in data using diagnostic tools and subject-matter expertise. This area is governed by the Statistics Canada Directive on the Validation of Statistical Outputs.
Why is this important?
The Census Program is the largest program within Statistics Canada. Census data are used by governments at all levels for program planning, analysis and decision making including federal transfer payments to the provinces and territories. In addition, the data are used by other major users such as private businesses, non-governmental organizations, and academic researchers. At Statistics Canada, Census data are used in other surveys for sampling and benchmarking.
The Agency recognizes the need to ensure that census data are of sufficient quality and produced in accordance with Statistics Canada's Quality Assurance Framework. The audit was included in the risk-based audit plan to provide reasonable assurance that the design and implementation of the quality assurance framework for the Census Program is working effectively. Given the large number of processes and systems in place for the census, the scope was limited to providing assurance that the following processes were in place:
- an effectively designed testing process for changes that impact processing, editing, and imputation activities
- a certification process for the review of census outputs.
Key findings
The Census Program has developed and approved strategies for testing system changes to ensure that systems are operating as intended.
Census task managers and subject matter experts have created test plans to carry out testing of processing activities; input is provided by working groups on testing approaches. There is no documented review or approval of testing plans prior to implementation.
There is limited evidence of oversight and accountability to ensure that all testing has been completed and that all the issues identified through testing have been addressed.
Census certification strategies have been developed and approved in order to validate data prior to their release to the public.
Mechanisms are in place to document the results of certification processes; the identification and resolution of issues identified through certification are not always documented.
Roles and responsibilities are not always segregated between those responsible for data processing and those responsible for certification in line with data quality guidelines.
Overall conclusion
The Census Program requires enhancements to its quality control framework to improve the effectiveness of its testing and certification processes.
While testing strategies are developed and reviewed, documentation is not sufficient to validate that all testing has been completed as planned and all known issues have been addressed.
The certification processes have well-developed strategies, but would be further improved through the leveraging of external expertise, the segregation of roles and responsibilities, and the documentation of identified issues.
Conformance with professional standards
The audit was conducted in accordance with the Internal Auditing Standards for the Government of Canada, which includes the Institute of Internal Auditors (IIA) International Standards for the Professional Practice of Internal Auditing.
Sufficient and appropriate audit procedures have been conducted and evidence gathered to support the accuracy of the findings and conclusions in this report and to provide an audit level of assurance. The findings and conclusions are based on a comparison of the conditions as they existed at the time, against pre-established audit criteria. The findings and conclusions are applicable to the entity examined, and for the scope and time period covered by the audit.
Steven McRoberts
Chief Audit and Evaluation Executive
Introduction
Background
The Statistics Act requires that Statistics Canada collect, compile, analyze and publish statistical information on the economic, social, and general conditions of the country and its people. The Census Program is the largest data collection activity conducted by Statistics Canada to meet its obligations under the Act.
The Census Program provides a statistical portrait of the country and its population every five years. Given the importance of the program, the management of data quality is key to ensuring that the program's products are of sufficient quality for their intended uses.
The Census Program has designed its quality assurance systems and practices around the six dimensions of Statistics Canada's Quality Assurance Framework: relevance, accuracy, timeliness, accessibility, interpretability, and coherence.
Data quality is further supported by a governance committee structure including
- senior management committees
- the Census Steering Committee
- the Census Project Team (CPT)
- the Data Quality Review Board
- the Tolerance Management Working Group
- the Coverage Measurement and Improvement Project.
For the 2016 Census, there were significant changes and upgrades to the Census Program's key systems, sub-systems and applications. These changes included the reinstatement of the Long Form Census and the introduction of ICOS to collect census data. Changes and improvements undergo testing prior to implementation.
In addition to the testing of changes to systems, Statistics Canada employs a certification process to test the validity of its data to ensure they are reliable, sound and defensible. This process identifies and corrects inconsistencies in data using diagnostic tools and subject-matter expertise. This area is governed by the Statistics Canada Directive on the Validation of Statistical Outputs.
Audit objective
The objective of the audit was to provide the Chief Statistician and Statistics Canada's Departmental Audit Committee with reasonable assurance that
- there is an effective management control framework in place to manage the quality of census data in line with Statistics Canada's Quality Guidelines.
Scope
The scope of this audit included an examination of the quality control framework for the 2016 Census Program. The scope was limited to providing assurance that there was an effectively designed testing process for changes that impact the processing, editing, and imputation activities, and that there is a certification process in place for the review of census outputs.
Approach and methodology
The audit approach included an assessment and analysis of relevant documentation, as well as interviews with key management and staff across the Census Program and within Subject Matter Divisions.
The audit team examined the governance, risk management and control processes in place to manage testing activities for the census processing and edit and imputation systems, as well as for the certification activities.
This audit was performed in accordance with the Internal Auditing Standards for the Government of Canada, which includes the IIA International Professional Practices Framework.
As part of the processing, edit and imputation and certification activities, samples of key control areas were selected to perform various tests demonstrating
- how testing plans and strategies are developed
- the oversight and challenge function established to oversee the development of the testing strategies and plans
- the completion of testing as described
- the oversight and monitoring of testing results
- reporting and approval of testing results.
Authority
The audit was conducted under the authority of the approved Statistics Canada integrated Risk-Based Audit and Evaluation Plan 2017/2018 to 2019/2020.
Findings, recommendations and management response
Development and implementation of test strategies and plans for the census processing phase
The Census Program has developed and approved strategies for testing system changes to ensure that systems are operating as intended.
Census task managers and subject matter experts have created test plans to carry out testing of processing activities; input is provided by working groups on testing approaches. There is no documented review or approval of testing plans prior to implementation.
There is limited evidence of oversight and accountability to ensure that all testing has been completed and that all the issues identified through testing have been addressed.
Upgrades and changes are made to census information technology systems within each Census cycle. These changes should be tested to ensure that they have been effectively implemented and that the system is operating as intended.
An effective management control framework for testing should be in place. The framework should include the following:
- an approved testing strategy
- a plan that outlines the key tests that will be conducted
- a clear accountability process to ensure that testing has been carried out
- a process to ensure that errors identified in testing were adequately addressed.
The census processing phase
Census data are processed in a number of phases including receipt and registration, imaging and data capture, edits, coding, and edit and imputation.
Once information has been collected from census respondents, through either a paper or an online questionnaire, it is registered to ensure it is linked to a specific dwelling. The information is then either scanned (paper questionnaire) or automatically uploaded (electronic questionnaire), and the data are captured in the system.
Questionnaires that have been captured in the system are then reviewed in the first edit phase to address questionnaires that contain potential inconsistencies. These include instances where questions were left blank, responses are inconsistent with the number of household members reported, and where further follow-up for content or coverage is required.
Completed responses are all given a code based on a standard classification structure. Coded census responses then undergo the edit and imputation phase to address omissions and inconsistencies in forms not covered in the first edit phase. This final editing process detects errors and imputes (assigns) responses to address these errors or inconsistencies.
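The edit and imputation step described above can be illustrated with a simplified, purely hypothetical sketch. The rules, variable names and donor logic below are illustrative assumptions, not the actual census decision logic tables:

```python
# Illustrative sketch of an edit-and-imputation step. All rules,
# field names and data are hypothetical.

def edit_and_impute(record, donor_pool):
    """Detect blank or inconsistent responses and impute values."""
    imputed = dict(record)

    # Edit rule: a blank age fails validation.
    if imputed.get("age") is None:
        # Imputation: borrow the value from a similar "donor" record
        # (here, simply the first donor with a valid age).
        donor = next(d for d in donor_pool if d.get("age") is not None)
        imputed["age"] = donor["age"]

    # Consistency rule: household size must match the members listed.
    members = imputed.get("members", [])
    if imputed.get("household_size") != len(members):
        imputed["household_size"] = len(members)

    return imputed

record = {"age": None, "household_size": 2, "members": ["A", "B", "C"]}
donors = [{"age": 34}, {"age": 51}]
print(edit_and_impute(record, donors))
```

In practice, imputation methods are far more sophisticated (for example, selecting donors by similarity across many variables), but the sketch shows the two-part pattern the report describes: detect an error or inconsistency, then assign a response to resolve it.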
Testing the census processing phase
A number of tests were conducted for the Census Processing Phase to ensure that information systems are operating properly and producing the desired results.
The audit team reviewed the testing approach for three key areas:
- System acceptance testing (SAT) – SAT is a level of software testing where the system is tested for acceptability. The purpose of the test is to determine whether business requirements (i.e., expected results according to the system design) are met. This functional testing will check the behaviour of the system using representative data. As identified in the Census Processing System (CPS) Integrated Test Plan, SAT is some of the most comprehensive testing conducted on the CPS system.
- System integration testing (SIT) – These are low volume tests of the integration points between two or more information systems. Per the CPS Integrated Test Plan, SIT ensures accurate processing by permitting the evaluation of the data transferred between systems to ensure conformity to the business requirements (i.e., expected results according to system design).
- Edit and imputation (E&I) testing – The E&I process undergoes two types of tests:
- Pre-production testing – Each subject matter division (SMD) is responsible for testing a specific number of census variables (e.g., language, sex) by performing analysis of monitoring tables produced using the production data and decision logic tables of the previous census.
- Overall testing (operational testing) – The Census Operations Division (COD) tests that the front-end processes leading to the population and dwelling counts operate as expected.
In all three testing areas, the audit team reviewed testing approaches to ensure there was an effective control framework in place.
An effective control framework would include a testing strategy that identifies risks to be addressed through testing. It would also include an approved testing plan that will operationalize the testing strategy. Lastly, it would include a process to ensure that all testing was completed and all issues were addressed.
Testing strategies were developed and approved for SAT, SIT and E&I.
Review of documents showed that the COD had developed the CPS Integrated Test Plan (which covers all areas of census processing outside of E&I) to govern the planning and control of all testing efforts for the processing of all census data. The document outlined the overall testing strategy, including the various types of tests undertaken to ensure that the 2016 CPS functions as expected. The document also defined the sequence in which the tests were to be performed and assigned key roles and responsibilities. This document was approved by the CPT.
There was also a testing strategy developed for both the pre-production testing plan and the operational testing plan for the edit and imputation process. The strategy was reviewed and approved by the CPT.
Census task managers and subject matter experts have created test plans to carry out testing of processing activities; input is provided by working groups on testing approaches. There is limited documented review or approval of testing plans prior to implementation.
Systems acceptance testing
Census task managers were responsible for designing SAT plans and test cases to verify that system changes have been correctly implemented. Interviews indicated that test plans and test cases are reviewed and challenged prior to implementation but this process is not documented.
Interviews with census management revealed that a working group was created, composed of a number of functional area representatives, subject matter experts and the Processing Services Task Manager. It identified and discussed the potential issues or errors that may arise from the changes brought to systems, but these quality assurance activities were not documented.
System integrated testing
SIT ensures that data transferred between systems are accurate and complete. The SIT roles and responsibilities are outlined in the CPS Integrated Test Plan which specifies that SIT is controlled by the SIA Chief and coordinated by the SIA Test Coordinator.
The SIA, which is responsible for SIT, documented the integration points between the systems in a master integration sheet used to identify the areas requiring SIT. A review of the integration sheet confirmed that it was used to document the status of the testing.
In most cases, the description of the specific SIT to be performed was not documented. Additionally, the integration sheet did not identify the potential issues or errors to be mitigated by the SIT, the proposed testing approach, or which test cases were to be performed.
Interviews and a review of minutes of meetings indicated that the Technical Review Board reviewed and provided input into the testing methodology. The Board therefore provided some oversight over the test development, but it did not formally approve it.
Edit and imputation testing
The SMDs documented their overall testing approach in a narrative description. This approach was used for pre-production testing and outlined their proposed analysis. The COD developed test plans that outlined the objectives and expected outcomes of their operational tests. Interviews and a review of documentation indicated that, in both cases, insufficient evidence was present to confirm that test plans were reviewed or challenged to ensure they were comprehensive enough to address all identified risks.
In 2011, the Quality Assurance Review Report of the Census Edit and Imputation (2011) noted this same issue. The report indicated that neither the E&I testing strategies nor the test plans outline criteria/rules for determining when sufficient testing or analysis had been performed and who should have been challenging that decision.
All three testing areas lack a formal sign-off of test plans and test cases to ensure that sufficient testing has been designed to address all system risks. Without this review and approval, issues stemming from system changes are at risk of not being addressed.
There is limited evidence of oversight to ensure that testing has been completed and that all the issues identified through testing have been addressed.
System acceptance testing
Completion reports were prepared to highlight the SAT results. The reports include the risks and issues identified through the testing, as well as any relevant recommendations. The audit team randomly selected three test plans and confirmed that completion reports were prepared and that the results were supported by sampled test cases.
The audit noted a sign-off process to confirm that a completion report was done for each test. Review of these reports showed evidence that the tests were completed; however, none of the signature boxes confirming that they had been reviewed were completed. Census management could not confirm that all testing results had in fact been reviewed.
Interviews with census management indicated that because of the volume of tests that are performed, reviewing and approving the completion of all testing at the individual test case level may not have been feasible.
A review of the process indicated that if a potential system error or issue is identified through testing, a ticket is created within the Agency's JIRA tracking tool to investigate and address the identified weakness. Once the issue has been resolved, it is marked as closed in the system and the test is re-run. Evidence of the re-performance of the test is maintained within the JIRA system.
System integrated testing
The master integration sheet, maintained by the SIA, was used to document the status of the testing and its results. Documentation within the master integration sheet indicated that the SIT process was partially completed by the SIA, as well as by other sub-system teams. The document did not outline the strategies to be undertaken to test the integration points; it included only a listing of the integration points and limited documentation of what was performed. Interviews with the SIT coordinator also confirmed that results of testing were not recorded within the integration sheet because of difficulties experienced in obtaining information from the sub-system teams.
Edit and imputation testing
The audit team was unable to validate that planned testing was performed and that the results of that testing were reviewed. Some of the SMDs sampled did not maintain documentation to confirm that E&I pre-production testing was reviewed and approved. Additionally, inconsistencies were identified regarding supporting analysis used to conclude on the testing.
Interviews confirmed that, due to the decentralized nature of E&I testing, there is no mechanism to ensure that all planned testing was performed as expected. Additionally, there is no requirement for SMDs to report the results of their testing to COD for the consolidation of E&I test results.
The issues and risks identified through E&I testing were not documented or tracked by most SMDs. Through interviews, the audit established that one of the sampled SMDs maintained its own issues log; however, another SMD indicated that it did not document potential issues identified through testing.
All testing processes lack a formal sign-off to ensure that all testing has been completed as planned. There is also limited documentation and tracking of issues identified through testing. Without effective processes to provide oversight of testing completion, identified tests are at risk of not being carried out on the systems prior to their implementation.
Recommendations
The Assistant Chief Statistician, Census, Operations and Communications should ensure that
1. a process is in place to document the approval of the design and development of testing plans and test cases to ensure they are aligned with identified system risks.
2. for SIT and E&I testing, accountability is assigned for confirming that all testing has been completed as planned, and for tracking and addressing issues identified through testing.
Management response
Management agrees with the recommendations.
- 1. Documentation for the design and development of testing plans and cases will be improved to clearly document alignment between identified system risks and testing plans.
The review and approval process for test plans and cases will be adjusted to ensure that approval of test plans is documented.
Deliverables and timeline
The Director General, Census Management Office will
- provide updated guidelines to all managers responsible for testing plans by September 2018.
- 2. Testing will be delegated to the sub-project, task and sub-task managers within the census program:
- schedules will be put in place by managers at the appropriate level to formally verify the completion status of all test cases in test case inventories as identified within test plans
- documentation of test case completion will be improved to ensure the reporting of completion and the tracking and addressing of issues identified
- an adjusted approach will be applied as a good practice to all systems used in the Census Program.
Deliverables and timeline
The Manager, Processing Sub-projects and Manager, Systems Integration Sub-projects will
- schedule reviews of testing results for E&I testing and SIT by September 2018.
- report on test plan implementation throughout the project, including a confirmation of the completion of tests and resolution of issues.
- issue the first status reports on SIT by October 2018.
Development, approval and implementation of certification strategies
Census certification strategies have been developed and approved to validate data prior to their release to the public.
Mechanisms are in place to document the results of certification processes; the identification and resolution of issues identified through certification are not always documented.
Roles and responsibilities are not always segregated between those responsible for data processing and those responsible for certification in line with data quality guidelines.
Clearly defined certification strategies, considering both internal and external data sources, are key to ensuring that data are validated in an effective and efficient manner. Certification is a key control to ensuring that census systems are working appropriately and that data can be relied upon prior to public consumption.
The census certification process
Prior to the dissemination of census results, census data undergoes a rigorous quality control process. One of the key steps in this process is the certification of census results.
The goal of certification is to assess the quality of the census data at specific levels of geography in order to ensure that quality standards for public release are met. Subject matter experts review a number of factors to assess the quality of the data derived from the census, including response and error rates, comparisons with the previous census, and comparisons with data from other surveys and administrative sources. All of these sources help validate the census results and determine whether they can be relied upon and released.
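The comparison step in this process can be illustrated with a simplified, hypothetical sketch: current counts are compared against a reference source (for example, the previous census), and variables whose relative change exceeds a tolerance are flagged for further review. The threshold, variable names and figures below are all illustrative assumptions, not actual certification rules or census counts:

```python
# Illustrative sketch of a certification comparison check.
# Tolerance, variables and counts are hypothetical.

def flag_for_review(current, reference, tolerance=0.05):
    """Return variables whose relative change exceeds the tolerance."""
    flagged = []
    for variable, value in current.items():
        ref = reference.get(variable)
        if ref:
            change = abs(value - ref) / ref
            if change > tolerance:
                flagged.append((variable, round(change, 3)))
    return flagged

current_counts = {"dwellings": 106_000, "population": 250_000}
previous_counts = {"dwellings": 100_000, "population": 240_000}
print(flag_for_review(current_counts, previous_counts))
```

A flagged variable would not necessarily indicate an error; as the report notes, it would prompt subject matter experts to investigate, drawing on other surveys and administrative sources before concluding whether the data can be released.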
Certification strategies were developed and approved; subject matter divisions leveraged external data sources when they were available.
A review of documents confirmed that two overarching certification strategies were developed to guide certification activities across all SMDs: the Population and Dwelling Counts Certification Strategy; and the Census Population Characteristics Certification. Leveraging these strategies, each individual SMD developed its own certification plan in relation to its respective census variables (e.g., age, sex). This process is coordinated by the Census Subject Matter Secretariat (CSMS).
The audit team sampled and reviewed the certification strategies developed by two SMDs for the 2016 Census. The documents outlined the certification objectives, proposed certification activities and timeframes, as well as key changes since the last census. The strategies were reviewed by the SMD Director, and sent to CSMS for comments.
The Guideline for the Validation of Statistical Outputs ("Validation Guideline") indicates that data validation should consider confrontation with other similar data sources, either published by Statistics Canada or external sources, and consultation with external stakeholders.
Through the review of sampled certification strategies, the audit team confirmed that SMDs are leveraging internal and external data sources to validate the reasonableness of census data. As a result, when available, variables are reviewed or validated with data from other government departments. As a good practice, consideration could also be given to leveraging external expertise to validate census data.
Mechanisms are in place to document the results of certification processes; the identification and resolution of issues identified through certification are not always documented.
Once the certification process is complete, each SMD is responsible for producing a certification results deck, which is presented to
- their Director
- CSMS management
- the Director General of Census Subject Matter, Social and Demographic Statistics Branch
- the Director General of the Census Program
- the Assistant Chief Statistician of Social, Health and Labour Statistics
- the Chief Statistician.
This deck presents the results, including potential errors identified, through the certification process. Additionally, a certification task summary report is prepared and approved by the CSMS Assistant Director, the Director of the SMD, the Director General of Census Subject Matter, Social and Demographic Statistics Branch, the Director General of the Census Program, and the ACS of Social, Health and Labour Statistics. The document includes the data quality indicators, any problems encountered and a summary of recommendations.
Lastly, during the certification process, potential issues in the data are identified and further investigated. However, not all sampled SMDs were documenting potential issues and how they were resolved. Documentation supporting the certification of results where potential issues are identified provides assurance to census management that all certification was performed as planned. This decreases the risk that an anomaly will go undetected.
Separation of duties between certification and edit and imputation processing is not always in place.
The Validation Guideline stipulates that "where possible, and when resources permit, ideally two separate teams could exist between the data processing of edit and imputation, and the subject matter experts doing the certification. This separation of duties allows for an independent assessment of the statistical estimates."
The audit revealed that, within certain SMDs, the same individuals who established the E&I parameters also developed and executed the certification strategies. The certification process is designed to be a challenge function where the results are independently reviewed. This independence reduces any bias in the review of results and increases the likelihood that anomalies will be identified.
Interviews revealed that subject matter experts within the SMDs may be best placed to carry out the certification process since they are experts in these areas. Therefore, with a limited number of subject matter experts specifically assigned to certification activities, an independent review may not always be possible.
However, the audit identified at least one good practice. In one of the sampled SMDs, the certification process is performed simultaneously by two subject matter experts, and the results are compared. Through this peer review process, the potential objectivity impairment of the subject matter expert involved in the E&I process is mitigated through an independent assessment of the certification results.
Recommendations
The Assistant Chief Statistician, Census, Operations and Informatics in collaboration with the Assistant Chief Statistician, Social, Health and Labour Field should ensure that
3. accountability is assigned to verify that certification strategies were carried out as intended.
4. a mechanism is in place to mitigate segregation of duties risks between E&I and certification of results.
Management response
Management agrees with the recommendations.
3. After the third release of 2016 Census data in August 2017, changes were implemented for the certification of the data for the remaining releases. A committee to review the certification results was put in place. The main objectives of the committee were to
- meet with the different census subject matter analysts and ask them to share and present the results of certification
- ensure that, for all the items listed in each certification strategy, the analyses had been undertaken and the results of these analyses were presented to the committee for review
- question and challenge important differences found in the data results and suggest areas of further investigation.
In addition to this, consultations with experts from other federal departments were undertaken for each census release after August 2017. Subject matter analysts presented and discussed the estimates and trends with these experts from other federal departments.
For the 2021 Census, the Assistant Chief Statistician, Social, Health and Labour Statistics Field will ensure that the governance structure is modified to implement a similar committee. The committee will be responsible for reviewing, validating and monitoring the activities and results of the certification throughout the certification period and for ensuring that the certification activities are carried out as planned.
Deliverables and timeline
The Director General, Census Subject Matter, Social and Demographic Statistics Branch will
- establish detailed terms of reference for the new committee, including its mandate, roles and responsibilities, membership, schedule and action plan, including consultations with experts from other federal departments. The terms of reference will be approved by the Director General of Census Subject Matter and the Census Manager and be presented to and approved by the 2021 Census Steering Committee by March 31, 2019.
4. The Assistant Chief Statistician, Social, Health and Labour Statistics Field will ensure that all subject matter areas involved in census activities have clearly identified certification activities and segregation of those duties from other activities. This will ensure that certification activities maintain their challenge function in assessing the quality of census results, and that all steps specified in the certification strategies are conducted appropriately.
Deliverables and timeline
The Director General, Census Subject Matter, Social and Demographic Statistics Branch will
- ensure that directors of census subject matter areas obtain approval for their certification strategies from the Assistant Chief Statistician, Social, Health and Labour Statistics Field, the Director General of Census Subject Matter and the Census Manager. These strategies include risk mitigation and resource allocation plans for the 2021 Census, to ensure segregation of duties and that risk mitigation measures are in place. Approved plans will be presented to the 2021 Census Steering Committee by March 31, 2019.
Appendices
Appendix A: Audit criteria
Control objectives / Core controls / Criteria | Sub-criteria | Policy instruments/Sources
---|---|---
Objective 1: There is an effective management control framework in place to manage the quality of census data in line with Statistics Canada's Quality Guidelines. | |
1.1 An oversight process has been established to monitor the development of testing plans, the certification strategy and the results of testing and certification activities. | 1.1.1 Roles and responsibilities are clearly defined, understood and communicated. 1.1.2 There is a process for governance bodies to address errors and issues identified in the testing and certification process. 1.1.3 Testing plans and the certification strategy are formally challenged by individuals with appropriate authority and expertise. | Relevant Treasury Board legislative and regulatory requirements, and Statistics Canada policies and procedures, such as:
1.2 Risk management processes are in place to identify, assess and respond to risks within testing plans, the certification strategy and results. | 1.2.1 Risks within testing plans, the certification strategy and results have been clearly identified and documented. 1.2.2 Risk mitigation strategies have been developed to address risks identified in testing plans, the certification strategy and results. |
1.3 Testing plans and a certification strategy are in place, and issues and errors are identified within each key process. | 1.3.1 Formal test plans for processing and for edit and imputation, and a certification strategy, were developed, approved and communicated to key stakeholders. 1.3.2 Evidence that testing and certification have been performed and that the results were reviewed by individuals with delegated authority. 1.3.3 Evidence that issues and errors identified during testing and certification of results were escalated to governance bodies and addressed in a timely manner. 1.3.4 A process exists for consultation with stakeholders internal and external to Statistics Canada, using a consistent, approved approach in line with Statistics Canada's guidelines, to certify census results. 1.3.5 There is appropriate segregation of duties to ensure independent assessment of certification activities. |
Appendix B: Initialisms
Initialism | Description |
---|---|
ACS | Assistant chief statistician |
COD | Census Operations Division |
CPS | Census Processing System |
CPT | Census Processing Team |
CSMS | Census Subject Matter Secretariat |
E&I | Edit and imputation |
IIA | Institute of Internal Auditors |
SAT | Systems acceptance testing |
SIA | Systems Integration Authority |
SIT | Systems integration testing |
SMD | Subject matter division |