Audit of Justice Statistics

September 14, 2015
Project Number: 80590-89

Executive Summary

Since 1981, the federal, provincial and territorial deputy ministers responsible for the justice system in Canada have been working together, along with the Chief Statistician of Canada, on the National Justice Statistics Initiative (NJSI). The mandate of the NJSI is to provide information to the justice community and the public on the nature and extent of crime and victimization, as well as the administration of criminal and civil justice in Canada.

Through the Canadian Centre for Justice Statistics (CCJS), a division of Statistics Canada, the efforts of the NJSI are directed toward the production of data to inform the legislative, policy, management and research agendas of federal–provincial–territorial partners, and to inform the public. The focus of the CCJS is on the development, collection, integration and analysis of data that reflect trends in Canada, and on the development of national- and jurisdictional-level indicators. There are three main statistical programs within the CCJS: the Policing Services Program, the Courts Program and the Correctional Services Program.

The objectives of the audit were to provide the Chief Statistician and the Departmental Audit Committee with assurance that

  • Statistics Canada has established an adequate governance framework to support the quality of Justice Statistics data
  • effective control mechanisms have been established and are consistently applied to ensure the release of quality data in accordance with the agency's Quality Guidelines.

The scope of the audit focused on CCJS' Policing Services Program's Uniform Crime Reporting (UCR) Survey, as well as the Courts Program's Integrated Criminal Court Survey (ICCS).

The audit was conducted by Internal Audit Division in accordance with the Government of Canada's Policy on Internal Audit.

Key findings

Roles, responsibilities and accountabilities of the key personnel responsible for the quality of Justice Statistics are well understood within CCJS and by key external stakeholders. Several committees have been established to monitor the quality of CCJS data and formal terms of reference outlining their roles and responsibilities have been developed. However, a formal document, such as a CCJS accountability and governance framework, has not been developed to provide an overview of all of the key roles, responsibilities and accountabilities within the CCJS directorate, as well as the roles and responsibilities of governance committees and other stakeholders, including their interdependencies.

Outside of the work performed by the agency's Quality Secretariat, which exists to promote sound quality management practices for Statistics Canada program areas based on the agency's Quality Guidelines, there is also an internal Data Quality Secretariat (DQS) within CCJS, which was established in 2014. Although the CCJS DQS has been completing a number of quality reviews on its own, its role, responsibilities and accountabilities have not been formally defined.

Although high-level risks have been identified by the division for the development of the agency's Corporate Risk Profile, the division's risk register has not been updated since March 2012. No formal risk management processes have been developed at either the CCJS divisional or the individual program level for the identification and assessment of the risks to the quality of CCJS' data and data products. Additionally, no processes have been established for the development of risk mitigation strategies or the ongoing monitoring of both the risks and the effectiveness of the risk mitigation strategies by divisional management.

Through the development of the National Data Requirements document, a field interpretation document, and the UCR Incident-Based Survey Manual; the creation of a UCR National Training Officer position; and the ongoing relationship and communications with data providers, the Policing Services and Courts programs are adequately supporting the training and guidance needs of their data providers.

To improve the reliability, completeness and accuracy of the data, CCJS requests that data providers validate their data sets annually before the production of Justice Statistics publications released through either the agency's The Daily release process or CCJS' own publication, Juristat. The audit team noted that data providers are consistently signing off on the reasonability of the data and, when possible, its accuracy and completeness.

Since both the UCR Survey and ICCS rely solely on the reception of administrative data from data providers, there is a need for the effective cleansing and verification of the data received through the use of SAS-based edit and imputation programs. Data cleansing and verification processes help detect and correct (or remove) corrupt, duplicate or inaccurate records from a record set. For both the UCR Survey and ICCS, edit and imputation programs have been developed in conjunction with Statistics Canada's Methodology Branch to achieve the level of quality required for the purpose of the data. The audit team noted opportunities to further strengthen the expectation of follow-up activities relative to the edits, imputations, warnings and trends for the ICCS and the documentation of follow-up activities performed with data providers for both surveys.

Given that a representative from Statistics Canada's Operations and Integration Division is solely responsible for the data entry of responses received on paper from UCR1 respondents, the audit team noted further opportunities to strengthen this data entry process through the inclusion of a peer or supervisory data verification review.

Adequate internal and external review processes have been established to identify potential data errors within draft data products in a timely manner; however, the audit noted opportunities to improve the documentation maintained as evidence of these reviews and how issues and errors were addressed.

The audit also noted that, based on the evidence obtained, CCJS is addressing and communicating errors identified post-review (either pre- or post-release), and that the Statistics Canada Directive on Corrections to Daily Releases and Statistical Products is followed when addressing or escalating errors identified on production day, as well as post-release.

Overall conclusion

Statistics Canada has established an adequate governance framework with elements in place to support the quality of Justice Statistics data. The effectiveness of this framework could be further improved by formalizing the roles, responsibilities and accountabilities, including the interdependencies, of key CCJS personnel and of the external stakeholders that make up the governance committees. Further, the establishment of a robust risk management framework that periodically identifies and assesses current divisional and program-level risks would strengthen the mitigation of risks that could affect the quality of data produced by CCJS.

The audit noted that effective control mechanisms have been established and are consistently applied to ensure the release of quality data in accordance with the agency's Quality Guidelines. Although adequate guidance and training are provided to the CCJS data providers supplying administrative data, and a data cleansing and validation process is in place, the existing verification processes varied between the CCJS programs audited. Similarly, even though there is evidence of an effective review process to identify errors in data products, documentation of these reviews is not being consistently maintained.

Conformance with professional standards

The audit was conducted in accordance with the Internal Auditing Standards for the Government of Canada, which include the Institute of Internal Auditors' International Standards for the Professional Practice of Internal Auditing.

Sufficient and appropriate audit procedures have been conducted and evidence gathered to support the accuracy of the findings and conclusions in this report and to provide an audit level of assurance. The findings and conclusions are based on a comparison of the conditions, as they existed at the time, against pre-established audit criteria. The findings and conclusions are applicable to the entity examined and for the scope and time period covered by the audit.

Patrice Prud'homme
Chief Audit Executive

Introduction

Background

Since 1981, the federal, provincial and territorial deputy ministers responsible for the justice system in Canada have been working together, along with the Chief Statistician of Canada, on the National Justice Statistics Initiative (NJSI). The mandate of the NJSI is to provide information to the justice community and the public on the nature and extent of crime and victimization, as well as on the administration of criminal and civil justice in Canada.

Through the Canadian Centre for Justice Statistics (CCJS), a division of Statistics Canada, the efforts of the NJSI are directed toward the production of data to inform the legislative, policy, management and research agendas of federal–provincial–territorial partners, and to inform the public. The focus of the CCJS is on the development, collection, integration and analysis of data that reflect trends in Canada, and on the development of national- and jurisdictional-level indicators.

There are three statistical programs within the CCJS: the Policing Services Program, the Courts Program and the Correctional Services Program. Each program is responsible for one of the following flagship surveys:

  • Policing Services Program: The Uniform Crime Reporting (UCR) Survey collects data from police services across Canada on incidents, victims and persons accused of police-reported crime, including hate crimes.
  • Courts Program: The Integrated Criminal Court Survey (ICCS) collects detailed microdata on every appearance in adult criminal and youth court, which are aggregated into charges and cases.
  • Correctional Services Program: The Integrated Correctional Services Survey collects microdata on adults and youth under the responsibility of the federal and provincial–territorial correctional systems.

In addition to these three programs, CCJS also includes other groups and units:

  • The Analysis Section is responsible for working with members of the Liaison Officers Committee of the National Justice Statistics Initiative (LOCNJSI) to undertake analysis on special topics of interest to the NJSI. This section also undertakes analysis on a cost-recovery basis for initiative partners and for other government departments on justice-related issues.
  • The Data Access and Dissemination Unit works with program areas and other areas within Statistics Canada to increase access to CCJS data products, such as making justice data available through the research data centres.
  • The Data Development Unit (formerly the Corporate and Special Projects Unit) is responsible for working with members of LOCNJSI on a multi-year project to examine re-contact with the justice system. This project will create a series of sector-specific indicators that will give policy makers and researchers information on the extent of re-arrest, re-conviction and re-involvement with the policing, courts and corrections sectors in Canada, as well as the capacity to analyze cross-sectoral factors that influence re-contact. It is also responsible for a number of other projects to maintain the relevancy of the reports produced by CCJS in future years.
  • Outside of the work performed by the agency's Quality Secretariat, which exists to promote sound quality management practices for Statistics Canada program areas based on the agency's Quality Guidelines, the Data Quality Secretariat (DQS) was recently created within CCJS to carry out data-quality evaluations and to support its program areas in enhancing data-quality procedures. The intent of the DQS is to continue to evaluate the quality of data, assess key risks and document best practices for the various CCJS data programs; develop an implementation plan for addressing key areas of vulnerability with respect to data quality; and begin to implement recommended data quality initiatives.

The three CCJS programs rely on the reception of administrative data from justice partners, including policing services, courts and correctional facilities—referred to as 'data providers.' To ensure the quality of CCJS data and data products, a number of governance committees and working groups—comprising key CCJS personnel, data providers and other judicial stakeholders—have been established. For example, the LOCNJSI oversees the work of CCJS on behalf of the NJSI. The members of the LOCNJSI include departmental officials appointed by deputy ministers, the Statistics Canada director general responsible for CCJS, and a representative of the Canadian Association of Chiefs of Police. The committee provides insight into topics of interest within the community, reviews work-in-progress Juristats and other reports, and provides a communication channel between the data providers and CCJS to discuss data quality issues.

Audit objectives

The objectives of the audit were to provide the Chief Statistician and the Departmental Audit Committee with assurance that

  • Statistics Canada has established an adequate governance framework to support the quality of Justice Statistics data
  • effective control mechanisms have been established and are consistently applied to ensure the release of quality data in accordance with the agency's Quality Guidelines.

Scope

The scope of this audit included an examination of the adequacy and effectiveness of the quality assurance framework over Justice Statistics. Specific areas examined included whether

  • governance, roles, responsibilities and accountabilities for the quality of data within CCJS are clear and well communicated
  • management identifies and assesses the risks that may preclude the achievement of quality products
  • effective control mechanisms have been established and are consistently applied to ensure the quality of data from data providers for the ICCS and UCR Survey, as well as for the quality of information released as part of the Juristat and The Daily release processes.

The scope of the audit work originally included an assessment of the activities related to the production of Justice Statistics products from January 1, 2014, to December 31, 2014. This scope was extended to also include an assessment of the activities related to the production of Justice Statistics products pertaining to the Courts Program from January 1, 2013, to December 31, 2013, as the Courts Program has not processed any data since 2013 because of its current transition to the Social Survey Processing Environment.

The audit criteria for this audit are presented in Appendix A.

Approach and methodology

The audit work consisted of a comprehensive review and analysis of relevant documentation, interviews with key management and staff, and testing to assess the effectiveness of processes in place.

The field work included a review and testing of CCJS processes and procedures in place related to the quality and accuracy of the Justice Statistics information.

The audit focused on the assessment of the key controls and processes established for two of the three key CCJS surveys: the UCR Survey and the ICCS.

This audit was conducted in accordance with the Internal Auditing Standards for the Government of Canada, which include the Institute of Internal Auditors' International Professional Practices Framework.

Authority

The audit was conducted under the authority of the approved Statistics Canada integrated Risk-Based Audit and Evaluation Plan 2014/2015 to 2018/2019.

Findings, recommendations and management response

Objective 1: Statistics Canada has established an adequate governance framework to support the quality of Justice Statistics data.

Governance in support of the quality of Justice Statistics

Roles, responsibilities and accountabilities of the key personnel responsible for the quality of Justice Statistics are well understood within the Canadian Centre for Justice Statistics (CCJS) and by key external stakeholders. Several committees have been established to monitor the quality of CCJS data and formal terms of reference outlining their roles and responsibilities have been developed. However, a formal document, such as a CCJS accountability and governance framework, has not been developed to provide an overview of all of the key roles, responsibilities and accountabilities within the CCJS directorate, as well as the roles and responsibilities of governance committees and other stakeholders, including their interdependencies.

A CCJS Data Quality Secretariat was established in 2014, and although it has been completing a number of quality reviews, its role, responsibilities and accountabilities within CCJS, beyond this direct reporting work, have not been formally defined.

Given the number of internal and external stakeholders, a robust governance framework is essential to ensure the quality of Justice Statistics. Roles, responsibilities and accountabilities for the quality of data within the Canadian Centre for Justice Statistics (CCJS), including the Data Quality Secretariat (DQS), should be clearly documented and well understood. As well, appropriate and effective oversight bodies should be established to monitor the quality of CCJS data, as it pertains to the internal and external environments of CCJS.

Roles, responsibilities and accountabilities are well understood within CCJS and by key external stakeholders; however, opportunities exist to improve the formalization of their roles, responsibilities and interdependencies

A key component of governance structures is the establishment of roles, responsibilities and accountabilities of stakeholders. The audit team noted that although the roles, responsibilities and accountabilities of the key personnel responsible for the quality of Justice Statistics appear appropriate and are well understood within CCJS and by key external stakeholders, they are not formally documented. Rather, the only documentation outlining roles and responsibilities at the personnel level consists of job descriptions, which are either generic and do not fully capture all of the roles and responsibilities of key personnel, or are out of date.

The governance structure for CCJS is unique in that it reports not only to internal stakeholders, but also to external stakeholders, through CCJS' membership in the National Justice Statistics Initiative (NJSI), which works with CCJS to produce useful information to inform the legislative, policy, management and research agendas of federal–provincial–territorial partners, and the public. These external stakeholders are important partners in ensuring the quality of CCJS' data and data products. As a direct result of the NJSI, several governance committees have been established to monitor the quality of CCJS data, including the Liaison Officers Committee of the National Justice Statistics Initiative and the Police Information and Statistics (POLIS) Committee, a committee of the Canadian Association of Chiefs of Police. These committees include representatives of the relevant CCJS programs and external stakeholders, such as government and data-provider representatives, and their membership structure fosters a consistent understanding of data product quality expectations, data definitions and jurisdictional data quality requirements, among other matters. Although formal terms of reference have been developed for the various governance committees, there is no formally documented accountability framework outlining the roles, responsibilities and accountabilities of all governance committees, including their reporting structure and other interdependencies. Since many of the key personnel within CCJS have been in place for some time and have considerable organizational knowledge, there has not been a perceived need to document these arrangements.

However, because CCJS has a unique and interlinked governance framework that includes oversight at both the departmental level and through the various external partner organizations, there is the potential for overlap or gaps in coverage without a formal accountability framework in place. Such a framework would also provide a mechanism to ensure that key personnel and external stakeholders are aware of their responsibilities and accountabilities regarding data quality.

The roles, responsibilities and accountabilities for the CCJS' Data Quality Secretariat have not yet been formally defined and communicated

The DQS was created in 2014 for CCJS to address quality concerns stemming from the fact that specific Juristats (see footnote 1) had to be retracted because of data errors. The DQS, which is part of CCJS' Data Access and Dissemination Unit, is described in the 2015/2016 CCJS Operational Plan as a "new secretariat [that] has been formed to carry out data quality evaluations and to support program areas in enhancing data quality procedures."

Although the DQS has been working on the completion of a number of quality reviews within CCJS, its role, responsibilities and accountabilities outside of this type of reporting have not yet been well defined or documented. As well, interviewees noted confusion regarding the role of CCJS' DQS versus that of the agency-wide Quality Secretariat, which recently performed a quality review of CCJS' Courts Program. According to the Statistics Canada intranet, the agency's Quality Secretariat is dedicated to supporting the development and implementation of policies and procedures that promote sound quality management practices, designing and managing studies related to quality management, and providing advice and assistance to program areas on quality management.

The DQS currently comprises only the chief of the unit; no other positions within the unit have been staffed. In the interim, the unit's work is being performed by the chief and representatives from the Data Access and Dissemination Unit.

Without formal documentation of the expected role and responsibilities of the DQS, there is an increased risk that the DQS may not meet the expectations of both senior management and stakeholders in improving data quality throughout the CCJS division. Additionally, there is an increased risk of duplication of effort, especially with the agency's Quality Secretariat. The development of an accountability and governance framework, as mentioned above, would provide an opportunity to outline the roles and responsibilities of the DQS, as well as to formalize how the DQS' role differs from that of the agency's Quality Secretariat.

Recommendations

It is recommended that the assistant chief statistician, Social, Health and Labour Statistics, ensure that

  • a formal accountability and governance framework is developed that outlines the various roles, responsibilities and accountabilities within CCJS and of external stakeholders (including governance committees) responsible for CCJS data quality, and their interdependencies. This accountability framework should also include the role, responsibilities and accountabilities of the DQS.

Management response

Management agrees with the recommendation.

  • The DQS and program chiefs will jointly prepare an accountability framework, to be approved by the director, director general and assistant chief statistician responsible for the CCJS, which outlines the internal and external stakeholders' roles and responsibilities, the various committees and their interdependencies. To this end, CCJS will build on the existing terms of reference of the NJSI and of the POLIS Committee to harmonize and improve governance and accountability so that roles and responsibilities, as well as interdependencies, are clear.
  • A terms of reference document for the CCJS' DQS will also be prepared for approval by the director, director general and assistant chief statistician responsible for CCJS. The terms of reference will outline the roles and responsibilities of the DQS and its membership.

    Deliverables and timeline: Both a formalized accountability framework and the terms of reference documents are to be completed by June 2016.

Risk management in support of the quality of Justice Statistics

Although high-level risks have been identified by the division for the development of the agency's Corporate Risk Profile, the division's risk register has not been updated since March 2012.

No formal risk management processes have been developed at either the Canadian Centre for Justice Statistics (CCJS) divisional or the individual program level for the identification and assessment of the risks to the quality of CCJS' data and data products. Additionally, no processes have been established for the development of risk mitigation strategies or the ongoing monitoring of both the risks and the effectiveness of the risk mitigation strategies by divisional management.

An effective risk management framework needs to be established to identify, assess, mitigate and monitor the potential and emerging risks that may affect the quality of the Canadian Centre for Justice Statistics' (CCJS') data and data products.

No formal risk management process exists at either the CCJS divisional or individual program level to identify and assess risks posed to the quality of data

CCJS management participates in the agency's risk management exercise, periodically identifying and assessing the high-level risks to the achievement of its objectives as part of the Corporate Risk Profile exercise. However, no formal risk management processes have been established to regularly identify and assess CCJS' divisional or program-level risks.

Although various documents outline high-level risks facing the CCJS division, the detailed risk register for the division has not been updated since March 2012. The chiefs and the director of CCJS are in the process of reviewing and updating the risk register to reflect the risks currently facing the three CCJS programs; however, at the time of the audit, this exercise had not been completed.

The audit further noted that, although certain risks and issues are being informally mitigated and monitored for each program, no formal process has been established for developing divisional or program-level risk mitigation strategies and their monitoring. The previous risk register, dated March 2012, provides a rating for the effectiveness of current mitigation strategies; however, the risk register does not include specific information on what these mitigation strategies are or how they will address the identified risks. Additionally, no documentation has been maintained as evidence of the ongoing monitoring of the identified risks and the effectiveness of the risk mitigation strategies.

Without a formal process in place for the systematic and ongoing identification and assessment of the risks to the quality of CCJS data and its related products, there is an increased possibility that these risks will not be appropriately escalated and mitigated in a timely manner by the appropriate risk owner.

Recommendations

It is recommended that the assistant chief statistician, Social, Health and Labour Statistics, ensure that

  • the newly formed Data Quality Secretariat (DQS) take on the role of maintaining a formal risk management process for CCJS that periodically identifies and assesses divisional and program-level risks, and that includes the development of risk mitigation strategies and the ongoing monitoring of their effectiveness.

Management response

Management agrees with the recommendation.

  • The director of CCJS will ensure that the DQS works with the CCJS program areas (Policing, Courts and Corrections) to identify risks and prioritize risk mitigation strategies and action plans for approval by the director, director general and assistant chief statistician responsible for CCJS. The risk profile and mitigation action plans will be reviewed and updated annually as part of CCJS' operational planning. The development of risk mitigation strategies and the monitoring of their effectiveness will rest with the CCJS program areas as part of survey operations.

    Deliverables and timeline: The CCJS risk profile is to be completed by January 2016. A formalized document integrating the risk mitigation plan as part of CCJS' operational planning process is to be completed by March 2016. The effectiveness of the mitigation plan will be reflected within the CCJS' risk profile on an ongoing basis.

Objective 2: Effective control mechanisms have been established and are consistently applied to ensure the release of quality data in accordance with the agency's Quality Guidelines.

Control mechanisms over data processing

Through the development of a National Data Requirements document and a Uniform Crime Reporting (UCR) Incident-Based Survey Manual; the creation of a UCR National Training Officer position; and the ongoing relationship and communications with data providers, the Policing Services and Courts programs are adequately supporting the training and guidance needs of their data providers.

To improve the reliability, completeness and accuracy of the data, the Canadian Centre for Justice Statistics requests that data providers validate their data sets annually before the production of Justice Statistics publications. The audit team noted that data providers are consistently signing off on the reasonability of the data and, when possible, its accuracy and completeness.

Since both the UCR Survey and the Integrated Criminal Court Survey (ICCS) rely solely on the reception of administrative data from data providers, there is a need for the effective cleansing and verification of the data received through the use of SAS-based edit and imputation programs. Data cleansing and verification processes help detect and correct (or remove) corrupt, duplicate or inaccurate records from a record set. For both the UCR Survey and ICCS, edit and imputation programs have been developed in conjunction with Statistics Canada's Methodology Branch to achieve the level of quality required for the purpose of the data. The audit team noted opportunities to further strengthen the expectation of follow-up activities relative to the edits, imputations, warnings and trends for the ICCS and the documentation of follow-up activities performed with data providers for both surveys.

Given that a representative from Statistics Canada's Operations and Integration Division is solely responsible for the data entry of responses received on paper from UCR1 respondents, the audit team noted further opportunities to strengthen this data entry process through the inclusion of a peer or supervisory data verification review.

The Statistics Canada Quality Guidelines note that when using administrative data, such as those relied upon by the Canadian Centre for Justice Statistics (CCJS), the key quality indicators are that data are relevant, accurate, timely and coherent. Controls over the processing of the data received from data providers ensure that the data meet these quality indicators. Adequate guidance and training should be provided to data providers to support the collection of reliable, complete, accurate and timely data, and channels should be established for data providers to communicate any changes to their systems and processes or any changes to administrative data definitions that affect the data provided to CCJS. Effective data cleansing and verification processes should also be established to ensure the reliability and relevancy of the data collected, and an effective validation process should be established to confirm the reliability, completeness and accuracy of the data provided.

Adequate guidance and training is provided to data providers to support the collection of reliable, complete, accurate and timely data

In conjunction with data providers, CCJS has developed National Data Requirements for each of the surveys reviewed. These requirements outline the approved data definitions and parameters for each data field required for both the Uniform Crime Reporting (UCR) Survey and the Integrated Criminal Court Survey (ICCS), and are relied on by data providers and program staff to ensure a consistent understanding of the data definitions. In addition to the National Data Requirements, the Policing Services Program has also developed a UCR Incident-Based Survey Manual to provide additional guidance to data providers in the discharge of their responsibilities relative to the quality of the data they provide.

Because of the significant number of data providers it relies on (over 100, covering over 1,000 policing service detachments), the Policing Services Program has developed standardized training, which is available to all data providers, and has created the role of National Training Officer, whose main responsibilities include providing guidance, support and training to data providers.

Given the limited number of data providers and the variety of information systems used to collect and transfer data to CCJS, the Courts Program has not developed standardized training for data providers outside of the National Data Requirements. Rather, the program relies on its ongoing relationship with its 13 data providers and their participation on the Liaison Officers Committee of the National Justice Statistics Initiative (LOCNJSI) Working Group on ICCS Data Quality to provide ongoing guidance and support to data providers.

The audit noted that both the Policing Services Program and the Courts Program have established adequate channels for data providers to communicate any changes to either their systems and processes or administrative data definitions that affect the data provided to CCJS. Specifically, the audit noted that both programs have mechanisms in place to foster strong working relationships with data providers through the LOCNJSI, the Police Information and Statistics Committee and various subcommittees of each, as well as through ongoing contact with individual data providers.

Adequate data cleansing and verification processes have been established to ensure the reliability and relevancy of the data collected; however, verification practices between programs varied

Since both the UCR Survey and ICCS rely solely on the reception of administrative data from data providers, there is a need for the effective cleansing and verification of the data received through the use of edit and imputation programs. Data cleansing and verification processes help detect and correct (or remove) corrupt, duplicate or inaccurate records from a record set. For both the UCR Survey and ICCS, edit and imputation programs have been developed in conjunction with Statistics Canada's Methodology Branch to achieve the level of quality required for the purpose of the data.

Specific to the UCR Survey, there are two types of data providers:

  • UCR1: Respondents are unable to provide digital microdata because their systems are incompatible with CCJS' systems, and instead provide aggregate counts on paper to CCJS.
  • UCR2: Respondents generally use one of two data management systems that are compatible with CCJS' systems, and provide microdata to the program via a data extract from their system.

In 2011, to leverage the expertise within Statistics Canada, the agency's Operations and Integration Division (OID) began playing a role in processing both UCR1 and UCR2 data. Specifically, OID is responsible for the data entry of the aggregate counts received on paper from UCR1 respondents and for the processing of the data received from UCR2 data providers.

Based on a sample of 5 UCR1 and 10 UCR2 respondents, the audit noted that it is general practice for the data to be manually captured by a representative from OID. Although the audit team noted that some controls have been embedded within the data entry process, such as batch total checks, no peer or supervisory data entry verification is being performed. Given that UCR1 respondents represent only a small percentage of data providers (approximately 4%), conducting a peer or supervisory data entry verification could help decrease the risk of data integrity issues going undetected.
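
For illustration only, the following sketch shows one way such a peer or supervisory verification could be structured. It is written in Python, the field names and counts are hypothetical, and it does not describe OID's actual data entry system; it simply pairs an independent second keying with the kind of batch total check already embedded in the process.

```python
# Illustrative sketch only: peer verification of manually keyed UCR1 aggregate counts.
# Field names and totals are hypothetical and do not reflect OID's actual system.

def verify_double_entry(first_pass: dict[str, int], second_pass: dict[str, int],
                        expected_batch_total: int) -> list[str]:
    """Compare two independent keyings of the same paper return and check the batch total."""
    issues = []

    # Field-by-field comparison between the original entry and the verification entry.
    for field in sorted(set(first_pass) | set(second_pass)):
        if first_pass.get(field) != second_pass.get(field):
            issues.append(f"Mismatch in '{field}': {first_pass.get(field)} vs {second_pass.get(field)}")

    # Batch total check, similar in spirit to the control already embedded in the process.
    keyed_total = sum(first_pass.values())
    if keyed_total != expected_batch_total:
        issues.append(f"Batch total {keyed_total} differs from expected {expected_batch_total}")

    return issues


# Example: a supervisor re-keys the counts and reviews any discrepancies before acceptance.
original = {"assaults": 42, "thefts_under_5000": 118, "mischief": 37}
verification = {"assaults": 42, "thefts_under_5000": 181, "mischief": 37}  # transposition error
print(verify_double_entry(original, verification, expected_batch_total=197))
```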

UCR2 data are processed using a SAS-based edit and imputation program. This program performs data cleansing and verification edits and imputations, and highlights fields that fall outside of expected parameters for verification purposes. The processing system then generates a report logging any edits, imputations or warnings resulting from the processing of the data, called an Edit and Imputation (E & I) report. Because deletions, edits and imputations performed on the data records can affect the overall quality of the data, the Policing Services Program indicated that follow-up with data providers is deemed critical in addressing certain potential data integrity issues.
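
For illustration only, the following minimal sketch shows the general shape of such an edit and imputation pass and the E & I style report it produces. It is written in Python rather than SAS, and the record fields, validity rules and imputation default are hypothetical examples rather than the actual UCR2 rules.

```python
# Illustrative sketch only (Python rather than SAS): an edit and imputation pass that
# logs edits, imputations and warnings to an E & I style report. Hypothetical rules.

VALID_CLEARANCE_STATUS = {"cleared by charge", "cleared otherwise", "not cleared"}

def edit_and_impute(records: list[dict]) -> tuple[list[dict], list[str]]:
    """Return cleaned records plus a report of edits, imputations and warnings."""
    report: list[str] = []
    cleaned: list[dict] = []
    seen_incidents: set[str] = set()

    for i, rec in enumerate(records):
        rec = dict(rec)  # work on a copy so the provider's extract is left untouched

        # Edit: remove duplicate incident records.
        if rec["incident_no"] in seen_incidents:
            report.append(f"record {i}: EDIT - duplicate incident {rec['incident_no']} removed")
            continue
        seen_incidents.add(rec["incident_no"])

        # Imputation: replace an invalid clearance status with a default value.
        if rec.get("clearance_status") not in VALID_CLEARANCE_STATUS:
            report.append(f"record {i}: IMPUTATION - clearance_status set to 'not cleared'")
            rec["clearance_status"] = "not cleared"

        # Warning: flag fields that fall outside expected parameters for analyst follow-up.
        if not 0 <= rec.get("accused_age", 0) <= 110:
            report.append(f"record {i}: WARNING - accused_age {rec['accused_age']} outside expected range")

        cleaned.append(rec)

    return cleaned, report


# Example: one duplicate, one invalid status and one out-of-range age are all logged.
records = [
    {"incident_no": "A1", "clearance_status": "cleared by charge", "accused_age": 34},
    {"incident_no": "A1", "clearance_status": "cleared by charge", "accused_age": 34},
    {"incident_no": "A2", "clearance_status": "pending", "accused_age": 150},
]
cleaned, report = edit_and_impute(records)
print(report)
```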

The audit team was able to review evidence of follow-up activities performed by the program and OID; however, varying levels of documentation were provided for 7 out of the 10 E & I reports sampled. Follow-up activities are therefore not being consistently captured, and a standardized process or template for UCR analysts to track the status of these follow-ups with data providers has not yet been developed. Further, there is currently no requirement for management within the program to review the actions undertaken by analysts when conducting this follow-up with data providers.

Without consistent and formal tracking of the follow-up activities undertaken with UCR2 data providers based on the results of the E & I reports generated by the processing system, and without review by CCJS program management of the activities undertaken by analysts to address data issues, there is an increased risk that follow-up activities may not be performed or that issues may not be remediated as expected.

The ICCS data, like the UCR2 data, are received electronically in the form of microdata and then processed through a SAS-based program. Similarly, reports are automatically generated by the system to provide CCJS with a summary of the edits and imputations performed by the program and any subsequent warnings that are flagged.

Analysts follow up with ICCS data providers on the edits, imputations and warnings performed or flagged by the system when these are deemed material. The program indicated that, historically, deleted or imputed records within ICCS data have represented only a small percentage of received data records. The audit team noted, however, that no formal parameters (i.e., threshold numbers or percentages) have been established for the Courts Program team responsible for the ICCS to trigger further investigation of processing edits, imputations, warnings or trends. Rather, the need for follow-up activities is left to the discretion of the responsible analyst, based on a review of a tracking spreadsheet of the number and type of edits, imputations and warnings that were generated by the system for each data provider in past years.

Without the establishment of formal parameters (numbers and/or percentages) for the investigation of ICCS processing edits, imputations, warnings and trends, there is an increased risk that significant data integrity issues may not be investigated or key data may be deleted from the population, affecting the quality of data products.
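
For illustration only, the following sketch shows what formal parameters of this kind could look like when applied to the counts already summarized in the processing reports. The 2% and 5% thresholds and the example figures are assumptions made for the purpose of the illustration, not values established by CCJS or its methodologists.

```python
# Illustrative sketch only: hypothetical formal parameters that would trigger follow-up
# on an ICCS processing run. Thresholds and counts are examples, not CCJS values.

IMPUTATION_RATE_THRESHOLD = 0.02   # investigate if more than 2% of records were imputed
DELETION_RATE_THRESHOLD = 0.05     # investigate if more than 5% of records were deleted

def needs_followup(records_received: int, records_imputed: int, records_deleted: int) -> list[str]:
    """Return the reasons, if any, that a processing run should be investigated further."""
    reasons = []
    if records_received == 0:
        return ["no records received - contact data provider"]
    imputation_rate = records_imputed / records_received
    deletion_rate = records_deleted / records_received
    if imputation_rate > IMPUTATION_RATE_THRESHOLD:
        reasons.append(f"imputation rate {imputation_rate:.1%} exceeds threshold")
    if deletion_rate > DELETION_RATE_THRESHOLD:
        reasons.append(f"deletion rate {deletion_rate:.1%} exceeds threshold")
    return reasons


# Example: a run with 1,200 records, 40 imputations and 15 deletions is flagged for
# its imputation rate (3.3%) but not its deletion rate (1.3%).
print(needs_followup(1200, 40, 15))
```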

An effective validation process has been established to confirm the reliability, completeness and accuracy of the data provided

To improve the reliability, completeness and accuracy of the data, CCJS requests that data providers validate their data sets annually before the production of Justice Statistics publications. As part of the process, CCJS will prepare data tables outlining the data provider's complete data set for the year, along with explanations of the tables provided and their significance. Data providers are then given a period of time to review the data packages and to provide the applicable CCJS program with any corrections or explanations for unexpected variances. Once these changes are made, data providers are expected to sign off on the data set.

The audit team noted that data providers are consistently signing off on the reasonability of the data, and when possible, its accuracy and completeness. Based on a sample of 18 data providers for the UCR Survey and ICCS, all of the annual validation packages were found to be signed off for the year in question.

Recommendations

It is recommended that the assistant chief statistician, Social, Health and Labour Statistics, ensure that

  • a standardized process, with relevant tools, is established for the UCR2 as it pertains to the tracking of all edits, imputations and warnings outlined in the E & I reports, including the documentation of consultations with data providers
  • formal parameters are established for the investigation of ICCS data edits, imputations, warnings and trends. These parameters should be established based on the impact that a change in each variable has on the survey.

Management response

Management agrees with the recommendations.

  • The UCR survey manager will develop a reference document and log template so that analysts document data quality issues and follow-ups with respondents consistently. This reference document and log template will be approved by the chief of the Policing Services Program.

    Deliverables and timeline: Both formalized documents, the reference document and the log template, are to be completed by March 2016.
  • The ICCS survey manager and methodologists are to prepare documentation on the formal parameters and processing reports, which is to be approved by the chief of the Courts Program. The documents will show the formal parameters to monitor edits, imputations and data verification for the ICCS.

    Deliverables and timeline: Formalized documentation for the ICCS is to be completed by fall 2017.

Control mechanisms over the production of Justice Statistics products

Adequate internal and external review processes have been established to identify potential data errors within draft data products in a timely manner; however, the audit noted opportunities to improve the documentation maintained as evidence of these reviews and how issues and errors were addressed.

The audit noted that although some evidence was obtained demonstrating that the Canadian Centre for Justice Statistics is addressing and communicating errors identified post-review (either pre- or post-release), no formal processes highlighting the steps to be followed and the documentation to maintain when addressing or escalating errors identified post-review have been established.

Effective control mechanisms over the production of Justice Statistics data products help enable the Canadian Centre for Justice Statistics (CCJS) to meet stakeholders' quality expectations. In accordance with the Statistics Canada Quality Guidelines, it is critical to have controls in place to ensure that the data released by CCJS are relevant, accurate, timely and coherent. To achieve this, an effective review process should be established to identify and address potential data errors in data products in a timely manner. Additionally, an effective framework should be established to ensure that any data errors identified post-review are also communicated and addressed in a timely manner.

An effective review process has been established to identify data errors within draft data products in a timely manner; however, evidence of the conduct of the internal and external reviews is not being consistently maintained

Once all data providers have signed off on the data sets, both the Courts Program and the Policing Services Program use these validated data sets to create data products (via Juristat and The Daily). CCJS program analysts are responsible for creating the data products and, once completed, the data products undergo a robust internal review process, which includes (1) a peer review whereby all of the data are recalculated and analyzed; (2) a review by the survey manager and program chief, which includes questioning the analysis undertaken by the analyst and any unexpected data trends; and (3) reviews by the chiefs of the other CCJS programs and the director or director general for overall reasonableness of the data and consistency of their appearance with other CCJS data products.

In addition to the internal review process, to improve the reliability of the data reported, the data products are also submitted to external stakeholders, including data providers, for review and validation. Specifically, the Integrated Criminal Court Survey (ICCS) Juristat is submitted to the associated liaison officers (LOs), and the Uniform Crime Reporting (UCR) Juristat is submitted to Police Information and Statistics Committee representatives and their respective LOs, to validate that the information in the publication appears accurate.

Although the audit team was satisfied with the level of internal and external review, it noted that an inconsistent level of documentation is maintained to demonstrate the conduct of these reviews, including how issues and errors have been addressed and resolved.

No internal formal processes have been established to communicate and remediate data errors identified post-review (pre- and post-release)

The processes to be followed in the event of the identification of an error post-review (both pre- and post-release) have not been formally documented. Rather, the audit noted that informal processes are being followed when an error is identified internally or communicated by external stakeholders. Specifically, Statistics Canada's Dissemination Division has developed an error log to track the errors discovered in released publications. This error log includes the results of an assessment of the error's impact, whether the data have been corrected, and the reload date of the publication. However, the audit noted that, for identified errors, inconsistent documentation is maintained to indicate how these errors were communicated, documented and addressed. For some identified errors, a communications plan was developed and maintained to describe the error and its impact, as well as to compare the original data to the revised data and provide a strategy for correcting the error and communicating the impact to the applicable stakeholders. However, for other errors, no documentation evidencing the internal communication and escalation of the issue was maintained.
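
For illustration only, the following sketch shows the kind of structured log entry that could make the documentation of identified errors more consistent, capturing the impact assessment, correction status, reload date and communication trail in one place. The field names and the example values are hypothetical and do not represent the Dissemination Division's actual error log format.

```python
# Illustrative sketch only: a structured error-log record with a communication trail.
# Field names and example values are hypothetical.

from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class ErrorLogEntry:
    publication: str                      # e.g., a Juristat or Daily release identifier
    description: str                      # nature of the error
    impact_assessment: str                # result of the impact assessment
    corrected: bool = False               # whether the data have been corrected
    reload_date: Optional[date] = None    # date the publication was reloaded
    communications: list[str] = field(default_factory=list)  # how the error was communicated and escalated


# Example entry recording both the correction and the communication trail.
entry = ErrorLogEntry(
    publication="Hypothetical Juristat release",
    description="Transposed figures in one provincial table",
    impact_assessment="Low - headline national rates unaffected",
    corrected=True,
    reload_date=date(2015, 6, 1),
    communications=["Program chief notified", "Communications plan prepared", "Explanation note posted on reload"],
)
print(entry)
```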

Once errors have been corrected and publications have been reloaded on the Statistics Canada external website, an explanation note is included on the website stating the nature of the error that has been corrected and the date the publication was reloaded.

The audit also noted that CCJS is addressing and communicating errors identified post-review, and that the Statistics Canada Directive on Corrections to Daily Releases and Statistical Products is followed when addressing or escalating errors identified on production day, as well as post-release.

Recommendations

It is recommended that the assistant chief statistician, Social, Health and Labour Statistics, ensure that

  • pre-release internal and external review processes, as well as the expectations of the reviewer in reviewing data products, are formally documented
  • adequate documentation is maintained to demonstrate the conduct of both the internal and external reviews of the data products. This could include a Quality Assurance sign-off checklist to be completed by all management levels before data are released.

Management response

Management agrees with the recommendations.

  • The director of CCJS will ultimately ensure that a one-page document on reviewer expectations is developed so that CCJS stays up-to-date on the authority and acknowledgment of confidentiality for advance release for data validation. The discussion on the expectations of the reviewer, who is most likely the LO of the National Justice Statistics Initiative, will also take place during the orientation of new LOs.

    Deliverables and timeline: A formalized one-page document is to be completed by fall 2015.
  • The author of each Juristat report will be responsible for preparing the log-sheet outlining the comments of the internal and external reviewers and CCJS' responses to the comments. These log-sheets will be approved by the chief of the Analysis Unit.

    Deliverables and timeline: Log-sheets will be created and approved continually as reports are published via Juristat.

Appendices

Appendix A: Audit criteria

Objective 1: Statistics Canada has established an adequate governance framework to support the quality of Justice Statistics data.

Criterion 1.1: Roles, responsibilities and accountabilities for the quality of data within the Canadian Centre for Justice Statistics (CCJS) are clear and well communicated.

Sub-criteria:

  • 1.1.1 Roles, responsibilities and accountabilities for key personnel responsible for the quality of Justice Statistics have been clearly documented and are well understood.
  • 1.1.2 The role, responsibilities and accountabilities of the Data Quality Secretariat in supporting the quality of data within CCJS have been clearly documented and are well understood.
  • 1.1.3 Appropriate and adequate oversight bodies have been established to monitor the quality of CCJS data.

Policy instruments: Management Accountability Framework (MAF) – Core Management Control; Statistics Canada Quality Guidelines.

Criterion 1.2: Management identifies and assesses the risks that may preclude the achievement of quality as an objective for the CCJS.

Sub-criteria:

  • 1.2.1 Formal processes and guidelines exist and are applied to facilitate the identification and assessment of risks to the quality of Justice Statistics data.
  • 1.2.2 Risk mitigation strategies have been developed to address key risks and are monitored continuously for effectiveness.
  • 1.2.3 Employees are aware of values and ethics directives and understand who and where to report potential wrongdoing.

Policy instruments: MAF – Core Management Control; Statistics Canada Quality Guidelines.

Objective 2: Effective control mechanisms have been established and are consistently applied to ensure the release of quality data in accordance with the agency's Quality Guidelines.

Criterion 2.1: Effective control mechanisms have been established and are consistently applied to ensure the quality of data from data providers for the Integrated Criminal Court Survey and Uniform Crime Reporting Survey.

Sub-criteria:

  • 2.1.1 Adequate guidance and training is provided to data providers to support the collection of reliable, complete, accurate and timely data.
  • 2.1.2 Channels have been established for data providers to communicate any changes to their systems and processes or any changes to administrative data definitions affecting the data provided to CCJS in a timely manner.
  • 2.1.3 Effective data cleansing and verification processes have been established to ensure the reliability and relevancy of the data collected.
  • 2.1.4 An effective validation process has been established to confirm the reliability, completeness and accuracy of the data provided.

Policy instruments: MAF – Core Management Control; Statistics Canada Quality Guidelines.

Criterion 2.2: Effective control mechanisms have been established and are consistently applied to ensure the quality of information released as part of the Juristat and The Daily release processes.

Sub-criteria:

  • 2.2.1 An effective review process has been established to identify data errors in a timely manner.
  • 2.2.2 Edits proposed by Communications Division are reviewed in a timely manner.
  • 2.2.3 An effective framework has been established to ensure that data errors identified post-review are communicated and addressed in a timely manner.

Policy instruments: MAF – Core Management Control; Statistics Canada Quality Guidelines.

Appendix B: Acronyms

  • CCJS – Canadian Centre for Justice Statistics
  • DQS – Data Quality Secretariat (CCJS)
  • E & I – Edit and imputation (report)
  • ICCS – Integrated Criminal Court Survey
  • LO – Liaison officer
  • LOCNJSI – Liaison Officers Committee of the National Justice Statistics Initiative
  • MAF – Management Accountability Framework
  • NJSI – National Justice Statistics Initiative
  • NTO – National Training Officer
  • OID – Operations and Integration Division
  • POLIS – Police Information and Statistics Committee
  • UCR – Uniform Crime Reporting (Survey)

Note:

Footnote 1

Juristat is a publication that provides in-depth analysis and detailed statistics on a variety of justice-related topics and issues. Topics include crime; victimization; homicide; civil, family and criminal courts; and correctional services. It is intended for those with an interest in Canada's justice system as well as those who plan, establish, administer and evaluate justice programs and projects.
