AgZero: Using alternative data and advanced technologies to reduce response burden on farmers

Like other data users, farmers want timely, accurate and detailed data, while completing as few traditional surveys as possible. That is why in April 2019, Statistics Canada set a goal to move beyond a survey-first approach by replacing survey data with data from administrative sources.

This project, dubbed AgZero, is using alternative data sources and advanced technologies, such as Earth Observation data and machine learning, to reduce the response burden on farmers to as close to zero as possible by 2026. Through this process, Statistics Canada will continue to provide the same high-quality information, while applying the same rigorous privacy and confidentiality standards that Canadians expect and deserve.

By 2026, farmers will spend less time answering survey questions.

Early milestones include:

  • In July 2019 and March 2020, Statistics Canada produced estimates on the number of temporary foreign workers in the agriculture sector in Canada using administrative data. The estimates were produced with zero direct contact with farmers, saving them valuable time.
  • The agency implemented a new crop yield model for the July 2019 Field Crop Survey in Manitoba using satellite imagery and administrative data. This resulted in fewer survey questions for respondents in that province. The goal is to expand this model to as many provinces as possible by 2022, depending on the availability of administrative data (an illustrative sketch of this kind of model follows this list).
  • In April 2020, Statistics Canada used administrative data to produce annual estimates of the total number of employees in the agriculture sector without having to ask farmers to complete questionnaires.
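
To give a sense of what such a model can look like, the following is a minimal, purely illustrative sketch in Python; it is not Statistics Canada's methodology, and every figure, feature and variable name is hypothetical. It fits a simple regression of historical survey yields on a satellite-derived vegetation index and an administrative covariate, then estimates the current season's yield without asking farmers anything.

    # Hypothetical illustration only: estimate crop yield from a satellite
    # vegetation index (e.g., NDVI) and an administrative covariate.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Toy training data: one row per region-year.
    # Columns: mean growing-season NDVI, seeded area from administrative records (thousand ha)
    X_train = np.array([
        [0.61, 120.0],
        [0.55, 130.0],
        [0.68, 110.0],
        [0.59, 125.0],
        [0.72, 118.0],
    ])
    # Yields (tonnes per hectare) observed in past survey cycles
    y_train = np.array([3.1, 2.7, 3.6, 2.9, 3.9])

    model = LinearRegression().fit(X_train, y_train)

    # Estimate the current season's yield from this year's satellite and
    # administrative data, with no survey question asked of farmers.
    X_current = np.array([[0.65, 122.0]])
    print(f"Modelled yield estimate: {model.predict(X_current)[0]:.2f} t/ha")

In practice, a model like this would be calibrated and validated against survey benchmarks before any survey content is dropped.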

A history of trusted agriculture statistics

Since the time of the first census in 1921, Canada's national statistical office has collected, analyzed and reported on agriculture in Canada so that – together – we can better understand ourselves and our country. As the agency continues to modernize and chart new methods of collecting data, we are committed to protecting the rightful privacy of Canadians' information. It is our duty by law.

How AgZero keeps your information safe and private

Our AgZero project follows the same rigorous privacy and confidentiality standards as all other statistical programs at Statistics Canada. All collected information is anonymized: this means that data that is made public can never be connected to you, your household or business.

The project also applies innovative methods to preserve security, privacy and confidentiality, including the agency's Necessity and Proportionality Framework. It helps ensure that the agency's need for data is well-defined, and that we work to balance the volume and sources of data with the need to reduce the response burden on Canadians—all while maintaining the protection of their privacy. For more, check out Statistics Canada's Trust Centre.

Learn more about AgZero

As part of our commitment to engagement and transparency, Statistics Canada's Agriculture Statistics Program provides regular updates on the AgZero initiative to key stakeholders. These include Agriculture and Agri-Food Canada, provincial and territorial ministries of agriculture, and key industry groups.

Stay up-to-date on the latest news by following Statistics Canada's social media channels, or by registering for My StatCan agriculture updates.

You can read more about AgZero in the StatCan Blog: Reducing the response burden imposed on farmers and business.

Do you have any questions about our AgZero project? If so, contact the AgZero team.

The road to AgZero

Modernization projects

Statistics Canada fosters a culture of innovation—it is at the heart of everything we do. Our modernization initiative is based on five key pillars, which were developed in collaboration with our stakeholders in a series of consultations to better understand their information needs. These consultations, coupled with the lessons learned from our four pathfinder projects, are helping transform Canada’s national statistical agency into one that is even more modern and responsive to our data-driven world.

Today, we know that traditional statistics-gathering methods are no longer sufficient to accurately measure Canada’s economic and societal changes. That is why Statistics Canada’s focus has shifted toward leveraging administrative data, using advanced technologies and developing new, cost-effective methods to link and integrate data from a variety of sources.

As we experiment with new methods, we will continue to apply the same rigorous privacy and confidentiality standards to protect Canadians’ information. It is our responsibility by law.

Our modernization projects

Learn about some of our latest initiatives that are driving our modernization forward.

Audit of Acquisition of Data from Alternative Sources

Audit Report

November 2019
Project Number: 80590-112

Executive summary

Statistics Canada works with all levels of government and private sector organizations in the collection and compilation of statistical information. Although the agency has been acquiring data from alternative sources for decades, Statistics Canada data collection has traditionally been rooted in the administration of surveys. The agency has acknowledged that the traditional method of collecting data presents a series of unique challenges for meeting new and ongoing data needs and reducing response burden.

As part of the agency's modernization initiative that was launched in 2017, the agency is moving towards an "administrative data first agenda." This agenda seeks to use the acquisition of data from alternative sources as the primary method to collect statistical information in order to improve the balance between data quality, response burden and costs.

For the purpose of this audit, "data from alternative sources" are defined as all data other than survey data; this includes, but is not limited to, administrative data (data obtained under the Statistics Act) as well as data available to the public.

Why is this important?

As the national statistical office, Statistics Canada must ensure it delivers relevant statistical information in an efficient and transparent manner. In today's environment, Statistics Canada, like many other public sector organizations, is faced with continuous change and has had to innovate and discover new ways to acquire data in order to continue to serve Canadians and fulfill its mandate.

Against this backdrop, the agency has implemented an administrative data first agenda that is aligned with its modernization initiative. Within this administrative data first paradigm, the agency seeks to respond to statistical demands and ensure that data from alternative sources are acquired in a strategic, timely and transparent manner. The audit determined the extent to which the agency has implemented effective processes and controls to support these objectives, while ensuring the stewardship of agency interests.

Key audit findings

The Data Acquisition Committee provides support on a wide range of topics, including strategic advice on potential partnerships, data strategies and new initiatives proposed by statistical program divisions. Interviews with committee members indicated that there is some concern regarding the committee's expanding roles and responsibilities.

Although the agency has broadly identified some of the risks to acquiring data from alternative sources through various corporate mechanisms, no full risk assessment has been completed for the acquisition of data from alternative sources. The restructuring of the agency's Tier 2 governance committees offers the agency a real opportunity to strengthen oversight on risk management activities.

The Statistics Canada Data Strategy was developed to provide a roadmap for how the agency, as part of its modernization agenda, will continue to govern data assets and key strategic data capacities. The Statistics Canada Data Strategy identifies a series of short-, medium- and long-term business objectives for acquiring access to alternative sources of data, but these objectives have not yet been aligned to specific performance indicators.

Roles and responsibilities for acquiring data from alternative sources are clearly outlined and defined within Statistics Canada's policy instruments. However, the policy instruments need to be updated to reflect some recent changes to legislative requirements under subsections 8(2) and 8(3) of the Statistics Act and the updated role of the Data Stewardship Division (DSD) and statistical program areas in the acquisition process.

Two training courses have been offered to assist agency employees in carrying out their responsibilities for acquiring data from alternative sources. Both courses could benefit from providing guidance on the implications of the new legislative requirements under the Statistics Act and further explaining how program managers can leverage DSD and the Office of Privacy Management and Information Coordination (OPMIC) during the acquisition process.

Data acquisition agreements are used to acquire administrative data. Publicly available data do not need to be obtained under the Statistics Act and, generally speaking, are not subject to the same requirements as administrative data.

Legal oversight is provided by the OPMIC during the data acquisition process. When a data acquisition agreement is drafted, OPMIC is contacted to perform a review of the terms and conditions within the proposed agreement prior to obtaining signatures.

When the agency incurs a financial fee for acquiring data from alternative sources, a cost proposal is submitted to the agency by the provider that identifies the total cost along with supporting financial information. However, there is limited evidence to demonstrate how management reviews these costs to determine whether they are reasonable and represent value for money.

Statistical program managers were generally not using the standard evaluation questionnaire or any other suitable tool to assess data quality and fitness for use. In collaboration with the Modern Statistical Methods and Data Science Branch, DSD is coordinating a more user-friendly version of the quality evaluation questionnaire.

Overall conclusion

Management has implemented a governance framework to provide oversight and advance the agency's administrative data first agenda, with corporate tools in place that support this agenda and identify some of the risks to acquiring data from alternative sources. The proposed restructuring of the agency's governance committees offers an opportunity to strengthen risk management activities for the acquisition of data from alternative sources. The aim is to provide Statistics Canada with a more effective governance structure to support the acquisition of data from alternative sources. The agency has also identified business objectives for acquiring data from alternative sources although the absence of key performance indicators could hinder the agency's ability to measure progress against its business objectives.

Internal controls are in place and functioning with some minor deficiencies, which are attributed to recent legislative changes and an internal transformation of responsibilities related to the acquisition process. Through training programs and internal communication, management has taken positive steps to inform employees but should ensure this is consistently communicated across the agency.

Conformance with professional standards

The audit was conducted in accordance with the Mandatory Procedures for Internal Auditing in the Government of Canada, which include the Institute of Internal Auditors' International Standards for the Professional Practice of Internal Auditing.

Sufficient and appropriate audit procedures have been conducted, and evidence has been gathered to support the accuracy of the findings and conclusions in this report, and to provide an audit level of assurance. The findings and conclusions are based on a comparison of the conditions, as they existed at the time, against pre-established audit criteria. The findings and conclusions are applicable to the entity examined, and for the scope and period covered by the audit.

Steven McRoberts
Chief Audit and Evaluation Executive

Introduction

Background

The world is constantly changing, with new technologies regularly emerging only to become obsolete almost overnight. With this ever-changing environment comes increased pressure for governments to have 'real-time' data to inform public policy, as data are the lifeblood of decision-making in the public sector.

In the public sector, data help direct and inform decision-making by providing valuable information, such as how many people live in poverty, whether greenhouse gas emissions are increasing, or how public money is being spent. The success of the Canadian economy and the prosperity of its communities depend in part on advancing programs that focus on strengthening public sector decision-making in key areas such as the financial, environmental and social fields.

Statistics Canada plays an integral role in this endeavour, as a foundational part of the agency's mandate is to collect and compile statistical information. For decades, the agency's business has been rooted in the administration of surveys, with data from alternative sources being used to complement its data collection. Recently, the agency acknowledged that traditional surveying of Canadians and businesses has become increasingly difficult because of rising costs and response burden.

For the purpose of this audit, "data from alternative sources" has been defined as all data other than survey data; this includes, but is not limited to, administrative data (data obtained under the Statistics Act) as well as data available to the public. Data available to the public do not need to be obtained under the Statistics Act and are not subject to the requirements of the Statistics Act.

Administrative data first agenda

As part of the agency's modernization initiative, which focuses on user-centric service delivery and leading-edge data integration, Statistics Canada has committed to an administrative data first agenda. This agenda is aimed at positioning the agency to better respond to statistical demands in the ever-evolving modern data world. In an administrative data first paradigm, administrative data are considered before a survey is conducted, in order to complement or replace survey data or to evaluate their quality.

Statistics Canada has taken steps towards realizing its administrative data first agenda by formalizing broad business objectives in a number of corporate initiatives. These objectives include:

  • identifying and gaining timely access to data from alternative sources for statistical purposes;
  • communicating to the public in a proactive and transparent manner why the agency seeks to acquire data from alternative sources; and
  • strategically managing the acquisition of data through the effective implementation of governance and stewardship.

A period of transition

When acquiring data from alternative sources, there are a number of processes, procedures and key stakeholders that play an integrated role. The Data Stewardship Division (DSD) is responsible for the creation and implementation of sound data stewardship protocols to ensure all data assets are well managed, secure, and fit for use. The Office of Privacy Management and Information Coordination (OPMIC) provides legal oversight and is responsible for reviewing data acquisition agreements for legal implications.

The process by which the agency acquires data from alternative sources is in a period of transition as a result of shifting internal roles and responsibilities, new legal interpretations to the Statistics Act and updates to internal governance structures. In the future, these changes will impact how the agency manages its risks for acquiring data from alternative sources.

Audit objective

The objective of the audit was to provide reasonable assurance to the Chief Statistician (CS) and the Departmental Audit Committee that management has adequate processes and controls in place to support the strategic, transparent and timely acquisition of data from alternative sources while ensuring the sound stewardship of public assets and agency interests.

Scope

The audit scope included an examination of acquisitions of data from alternative sources from private sector organizations where negotiations to acquire the data began or were completed during fiscal years 2017/2018 to 2018/2019, including acquisitions that were underway or not yet completed as at May 31, 2019. The audit focused on three key control areas: governance, tools and training, and internal controls.

Statistics Canada also acquires data from alternative sources from provinces, territories and federal departments and agencies. It was determined during the planning phase of the audit that acquiring data from the public sector poses less risk to the agency and for this reason, public sector data acquisitions were excluded from the scope of the audit.

Approach and methodology

This audit was conducted in accordance with the Mandatory Procedures for Internal Auditing in the Government of Canada, which include the Institute of Internal Auditors' International Standards for the Professional Practice of Internal Auditing. Field work consisted of a review of applicable processes, activities and tools to ensure compliance with Statistics Canada legislative and policy requirements.

Authority

The audit was conducted under the authority of the approved Statistics Canada Integrated Risk-based Audit and Evaluation Plan 2019/2020 to 2023/2024.

Audit findings, recommendations and management response

The data acquisition process

The Data Acquisition Committee (DAC) provides support on a wide range of topics, including strategic advice on potential partnerships, data strategies and new initiatives proposed by statistical program divisions. Interviews with committee members indicated that there is some concern regarding the committee's expanding roles and responsibilities.

Although the agency has broadly identified some of the risks to acquiring data from alternative sources through various corporate mechanisms, there has not been a full risk assessment completed for the acquisition of data from alternative sources. The restructuring of the agency's Tier 2 governance committees offers the agency a real opportunity to strengthen oversight on risk management activities.

The Statistics Canada Data Strategy (SCDS) was developed to provide a roadmap for how the agency, as part of its modernization agenda, will continue to govern data assets and key strategic data capacities. The SCDS identifies a series of short-, medium- and long-term business objectives for acquiring access to alternative sources of data, but these objectives have not yet been aligned to specific performance indicators.

Roles and responsibilities for acquiring data from alternative sources are clearly outlined and defined within Statistics Canada's policy instruments. However, the policy instruments need to be updated to reflect some recent changes to legislative requirements under subsections 8(2) and 8(3) of the Statistics Act and the updated role of the DSD and statistical program areas in the acquisition process.

Two training courses have been offered to assist agency employees in carrying out their responsibilities for acquiring data from alternative sources. Both courses would benefit from providing guidance on the implications of the new legislative requirements under the Statistics Act and further explaining how program managers can leverage DSD and OPMIC during the acquisition process.

As the national statistical office, Statistics Canada is constantly exploring ways to acquire alternative sources of data to better support its statistical programs and provide Canadians with valuable information. Statistics Canada can acquire data from alternative sources from both the public and the private sectors and, through its modernization initiative, has acknowledged their importance in supplementing its statistical programs. The DSD is responsible for developing sound data stewardship protocols to ensure all data assets are well managed, secure, and fit for use. DSD has developed an acquisition process that identifies the steps to acquire new sources of data from alternative sources. This process ensures that all proposed data acquisitions are consistently performed using a common approach.

Data acquisitions are contingent upon a number of key factors, including the complexity of the acquisition, the sensitivity of the data and the pre-existing relationship with the data provider. Normally, the process begins with statistical program managers identifying data from alternative sources collected and held by other organizations that can support the legal mandate of Statistics Canada. Once a need has been identified, a request is sent to DSD to determine whether the data will be for broad use or localized use.

Once the request is complete and the type of data is determined, DSD and OPMIC provide guidance to the program areas on how to proceed with the acquisition. With support from DSD, program managers establish a strategy to reach out to the data provider and initiate early discussions with key stakeholders. The next step is to begin negotiations with the data provider. This is a key point in the process when Statistics Canada meets with representatives from the data provider to discuss the data being requested, legal and privacy expectations and any associated costs. It should be noted that Statistics Canada does not pay for data but rather for the time and effort required to compile the intended data sources.

Data from alternative sources with an associated cost can be obtained under the Statistics Act (through a data sharing agreement), or outside the Statistics Act through a contract or written communication. When the data are obtained under the Statistics Act, the statistical program area is responsible for initiating expenditure initiation under section 32 of the Financial Administration Act and performing a financial review process, with support provided by OPMIC and DSD. For data acquisitions that do not fall under the authority of the Statistics Act, the Corporate Support Services Division (CSSD) is responsible for reviewing the proposed costs and determining whether the proposal should be accepted for payment. Through interviews, the audit found that CSSD's departmental delegation of financial authorities for services is $100,000 and that any contract above that threshold would be sent to Public Services and Procurement Canada.

An agreement is formalized through a data acquisition agreement, contract or written communication once all aspects of the negotiations have been finalized, including any potential cost. Typically, the agency begins receiving synthetic or test data from the provider and quality assessments must be completed to determine whether the data are usable within the agency's existing data environment. If the data are deemed fit for use, statistical program managers finalize and sign the data acquisition agreement with the data provider. The OPMIC plays a key role in this step as they are responsible for ensuring that the agreement contains terms and conditions that protect the legal interests of the agency.

Through consultation with DSD, the audit found that DSD's role in the acquisition process is expected to change. The intent is for program divisions to carry out their own data acquisitions and DSD to act as an intermediary on all matters related to data acquisitions. DSD's intent is to communicate these changes and make this process visible but at the time of audit, no timeline had been established.

The data acquisition process described above is supported through a series of controls, including governance committees, training courses and internal policy instruments. All play an important role in identifying and gaining timely access to data, and in communicating to the public why the agency seeks to acquire data for statistical purposes. The audit determined the extent to which these controls were operating, at a time when the agency is undergoing significant changes as it looks to advance its administrative data first agenda.

Governance mechanisms are undergoing changes that are expected to strengthen risk management.

The DAC was created in February 2018 as a result of consolidating the Administrative Data Management Committee, Collection Planning Committee and Business Response Management Committee. The DAC's mandate is to provide leadership and direction for data acquisitions with the objective of implementing and maintaining an "administrative data first agenda." The DAC is made up of 18 members, including senior managers, directors and directors general from multiple fields across the agency. The committee meets approximately once per month and provides support on a wide range of topics, including strategic advice on potential partnerships, data strategies and new initiatives proposed by statistical program divisions.

Interviews with committee members indicated that there is some concern regarding the committee's expanding roles and responsibilities. DAC was originally intended to be a strategic-level oversight body to provide high-level guidance on activities directly linked to acquiring data from alternative sources. However, the audit found that some of the DAC's activities are more ad-hoc in nature and not always within its intended mandate. For example, the committee was tasked with determining whether certain surveys should be made mandatory or voluntary. It is important that the DAC operate within its intended mandate to ensure that there is adequate time and resources to address key activities related specifically to the acquisition of data from alternative sources.

Moving forward, the agency has an opportunity to strengthen its governance with the proposed changes to the existing Tier 2 governance structure. These modifications are designed to consolidate the 10 existing Tier 2 governance committees into five new committees. The proposed 'Data in: Data Acquisition and Management' committee is one of the five new committees and will aim to provide Statistics Canada with a more effective governance structure to support the acquisition of data from alternative sources. The committee will be responsible for identifying risks and issues related to acquiring data from alternative sources and implementing mitigation strategies to ensure the agency is able to respond proactively to change and uncertainty. These new responsibilities are needed because risk management oversight for acquiring data from alternative sources needs to be strengthened to enable more effective decision-making throughout the agency.

The audit confirmed that no risk assessment has been completed for the acquisition of data from alternative sources, although the agency has broadly identified some of the risks to acquiring data from alternative sources through various corporate mechanisms.

Through interviews, agency employees indicated that there might be some merit to undertaking a risk assessment. The restructuring of the agency's Tier 2 governance committees offers the agency a real opportunity to strengthen risk management activities for acquiring data from alternative sources as it will improve decision-making in governance, strategy and objective setting. However, at the time of the audit this restructuring had not yet been fully implemented.

Work is required to ensure that key performance measures are in place.

The CS has stressed the importance of modernizing each step of the statistical process by embracing an administrative data first agenda. To achieve this, Statistics Canada has embedded performance measures in corporate initiatives like the SCDS. The SCDS was developed to provide a roadmap for how the agency, as part of its modernization agenda, will continue to govern data assets and key strategic data capacities. The SCDS is organized into two main pillars: data governance and data stewardship. Under data stewardship, there are seven 'strategic data capacities' that are intended to have performance measurement criteria to track progress against specific targets.

One of the seven strategic data capacities is 'data discovery'. This area links directly to the administrative data first agenda and identifies a series of short-, medium- and long-term business objectives for acquiring access to alternative sources of data. These include outlining why the agency seeks to acquire data from alternative sources, establishing mechanisms to include community engagement and implementing processes to support the acquisition of data from multiple access points across the agency.

However, the SCDS does not contain any objectives or performance measures related to the 'timeliness' and 'speed' with which the agency acquires data from alternative sources, even though, in interviews, agency employees stressed the importance of gaining access to data from alternative sources in a timely manner.

Overall, although the SCDS provides short-, medium- and long-term business objectives for acquiring data from alternative sources, these objectives have not been aligned to performance indicators. Without these indicators, the agency cannot fully track its progress against key objectives, such as timeliness and transparency, to determine whether its objectives for acquiring data from alternative sources are being achieved. In discussions, DSD indicated that measurement criteria for the acquisition of data from alternative sources, as outlined in the SCDS, will be developed with program areas and in consultation with the Corporate Strategy and Management Field, and that the indicators related to the short-term business objectives will be developed by early 2020.

Updates are required to policy instruments to ensure they are aligned with new legislative requirements and DSD's new role.

The Policy on the Use of Administrative Data Obtained under the Statistics Act and the Directive on Obtaining Administrative Data under the Statistics Act are the supporting internal policy instruments that provide direction on the acquisition and use of data from alternative sources. Overall, they effectively outline the roles and responsibilities of key stakeholders across the agency and define the operational steps involved in acquiring data from alternative sources.

DSD has played a significant role in the acquisition of data from new alternative sources. Its responsibilities have included acquiring administrative data sources with a broad scope and supporting statistical programs in their acquisition of administrative data. As previously stated, DSD's role in the acquisition process is expected to change: program divisions will be expected to carry out their own data acquisitions, and DSD will act as an intermediary on all matters related to data acquisitions. DSD indicated that it intends to make this new process visible and to update the policy instruments by the end of fiscal year 2019-2020.

The policy instruments have not yet been updated to reflect the new legal interpretation. Under the new legal interpretation (provided to the agency in February 2018), there are two modifications. Specifically, under subsection 8(2), the CS must now publish any mandatory request for information before the request is submitted to a data provider. Under subsection 8(3), the CS must notify the Minister of any new mandatory request for information at least 30 days before it is published.

Without these updates in the policy instruments, the agency increases the risk that employees may not be aware of, or fully understand these requirements. This could hinder the agency's ability to acquire data in an efficient and effective manner. Although work is underway to update the policy instruments, it has not yet been finalized.

Training courses cover key components of the data acquisition process, but there are opportunities for improvement.

Providing effective training to agency employees expands their knowledge base and allows them to strengthen skills specific to their job responsibilities. Agency employees who understand how to leverage interpersonal skills within the negotiation process will help the agency acquire data from alternative sources more efficiently. However, for this to occur, employees must be aware of and have adequate access to training opportunities so that they can develop the requisite knowledge and skills for acquiring data from alternative sources.

Since 2017-18, two training courses have been offered to agency employees to assist them in carrying out their responsibilities for acquiring data from alternative sources. However, both courses contain areas that could be improved. 'Obtaining Administrative Data under the Statistics Act', developed under the direction of DSD, provides guidance on governing instruments, concepts and tools for overcoming challenges during the data acquisition process. The course was last delivered to agency employees in January 2018 and does not include any guidance on the new legal interpretation of the Statistics Act that applies to data from alternative sources. It should be noted that DSD is currently updating the training course to address the implications of the new legislative requirements and DSD's new role in the data acquisition process.

'Skillful Negotiations' was developed under the direction and guidance of the Employee Development and Wellness Division to assist agency employees in carrying out effective negotiations. The course provides guidance on carrying out skillful negotiations, including identifying team roles, outlining different types of negotiations and leveraging interpersonal skills. It also gives participants the opportunity to provide feedback on the overall course content, including each of the four main training modules. Although the overall level of satisfaction among course participants is high (86.5%), the course could be improved by including specific guidance on the new legislative and legal requirements under the Statistics Act and on how to effectively engage DSD and OPMIC when seeking to acquire data from alternative sources.

Recommendation 1

It is recommended that the Assistant Chief Statistician, Analytical Studies, Methodology and Statistical Infrastructure ensure that the realigned governance structure embeds effective risk management oversight for acquiring data from alternative sources.

Management response

Management agrees with the recommendation.

  • 1.1 Through the restructuring of the agency's Tier 2 governance committees, a governing body will be identified. The committee's responsibility will include risk management activities for the acquisition of data from alternative sources.
  • 1.2 An improved governance process for the acquisition of data from alternative sources will be developed and documented. Supporting material will include explicit documentation of the entire governance process using flow diagrams as well as templates to support efficient documentation of:
    1. specific steps of the governance process;
    2. the steps considered to mitigate risks;
    3. the record of decisions; and,
    4. accountability.
  • 1.3 Program areas will be informed about the new governance process related to the acquisition of data from alternative sources.
Deliverables and timeline

The Director General of the Strategic Data Management Branch will identify a governing body whose responsibility will include risk management activities for the acquisition of data from alternative sources by March 2020.

The Director of the Data Stewardship Division will develop an improved governance process for the acquisition of data from alternative sources that will include key elements to mitigate risks and document accountability by June 2020.

The Director of the Data Stewardship Division and Director of the Office of the Chief Editor will:

  • Develop a communication plan to socialize the new governance process to managers and employees involved in the acquisition of data from alternative sources by March 2020.
  • Communicate activities defined in the communication plan by March 2021.

Recommendation 2

It is recommended that the Assistant Chief Statistician, Analytical Studies, Methodology and Statistical Infrastructure ensure that key performance indicators and measurable outcomes are developed and aligned to business objectives for acquiring data from alternative sources in the Statistics Canada Data Strategy.

Management response

Management agrees with the recommendation.

  • 2.1 Key performance indicators associated with the acquisition of data from alternative sources will be defined.
Deliverables and timeline

The Director of the Data Stewardship Division will establish key performance indicators that will measure the effectiveness of the governance processes associated with the acquisition of data from alternative sources by November 2020.

Recommendation 3

It is recommended that the Assistant Chief Statistician, Analytical Studies, Methodology and Statistical Infrastructure ensure that training courses and policy instruments are updated so that agency employees are provided with appropriate guidance and direction on the new legislative requirements under the Statistics Act and on DSD's new role in the acquisition of data from alternative sources.

Management response

Management agrees with the recommendation.

  • 3.1 Relevant policy instruments will be updated to reflect the latest legal and operational requirements based on the new governance process.
  • 3.2 Up-to-date training material related to the acquisition of data from alternative sources will become a mandatory requirement for staff involved in the acquisition of data. Mandatory training centered on Statistics Canada's legal obligations and associated governance operations will be delivered.
Deliverables and timeline

The Director General of the Strategic Data Management Branch will:

  • Update versions of the following policy instruments by June 2020 (draft versions) and November 2020 (final versions):
    • Policy on the Use of Administrative Data Obtained under the Statistics Act
    • Directive on Obtaining Administrative Data under the Statistics Act
    • Guidelines on data available to the public
  • Update mandatory training material that reflects legal and operational requirements by February 2021.

Review of data acquisition files

Data acquisition agreements are used to acquire administrative data. Publicly available data do not need to be obtained under the Statistics Act and, generally speaking, are not subject to the same requirements as administrative data.

Legal oversight is provided by the OPMIC during the data acquisition process. When a data acquisition agreement is drafted, OPMIC is contacted to perform a review of the terms and conditions within the proposed agreement prior to obtaining signatures.

When the agency incurs a financial fee for acquiring data from alternative sources, a cost proposal is submitted to the agency by the provider that identifies the total cost along with supporting financial information. However, there is limited evidence to demonstrate how management reviews these costs to determine whether they are reasonable and represent value for money.

Statistical program managers were generally not using the standard evaluation questionnaire or any other suitable tool to assess data quality and fitness for use. In collaboration with the Modern Statistical Methods and Data Science Branch, DSD is coordinating a more user-friendly version of the quality evaluation questionnaire.

The Directive on Obtaining Administrative Data under the Statistics Act requires agency employees to implement a series of internal controls within the acquisition process to protect and safeguard the agency. To provide reasonable assurance that controls were in place and functioning as intended, the audit examined 12 private sector data acquisitions (7 administrative data files and 5 publicly available data files) that had been completed within the audit scope. The key controls assessed included:

  • use of a data acquisition agreement
  • legal oversight
  • financial oversight
  • quality assessment

Data acquisition agreements are used to acquire non-publicly available data, but are used less often for publicly available data.

Under section 5.1.2 of the directive, the preferred method to obtain administrative data from private organizations is a data acquisition agreement, pursuant to the Statistics Act. However, the directive prescribes that an agreement may also take the form of an exchange of written communications or a contract, assuming it contains all the essential elements of a data acquisition agreement (i.e. legal authority to obtain data, its intended use or any legal requirements to protect it).

The audit reviewed seven administrative data acquisition files (obtained under the Statistics Act) to determine the extent to which the agency documents the terms and conditions of data agreements. In all seven files, the agency formalized data agreements that were aligned with the requirements under the Directive.

Publicly available data do not need to be obtained under the Statistics Act and, therefore are not subject to the same requirements as administrative data. The Guideline on data available to the public does not require or suggest that formal data acquisition agreements should be used when acquiring data available to the public.

The audit reviewed five acquisitions of publicly available data and found that three of the five files did not contain any form of agreement. The remaining two files contained either a formal license agreement or an exchange of written communications with the data provider. Under normal circumstances, there is no requirement to enter into an agreement for publicly available data.

Adequate legal oversight is provided by the Office of Privacy Management and Information Coordination.

OPMIC is responsible for reviewing all new agreements and working with legal counsel to ensure that the appropriate terms and conditions are in place to protect the agency's legal interests. Specifically, section 6.2.4.5 of the directive states that the Director of OPMIC must: 'support statistical program managers by reviewing all final data acquisition agreements before they are sent to the other organization.' As the directive does not specify what the review must include, we found that the review function varied depending on the complexity of the data acquisition.

Overall, when a data acquisition agreement was drafted, OPMIC was contacted to perform a review of the terms and conditions within the proposed agreement prior to obtaining signatures. The audit found that OPMIC provided legal oversight for all seven administrative data files (obtained under the Statistics Act). The exceptions where OPMIC did not perform a review included the file that contained an exchange of written communications and the three publicly available files without an agreement. As the Directive only identifies a requirement when there is a data acquisition agreement in place, OPMIC is not expected to be involved in the acquisition process in the absence of a formal agreement.

Financial reviews are taking place, but are not always formally documented.

When Statistics Canada acquires data from private sector organizations, the agency can acquire the data with no associated costs or pay the organization to compile the requested data. It is important to note that Statistics Canada does not pay for the data itself but rather for the time and effort required to compile the intended data sources. DSD indicated that when there is a cost, the data provider develops a cost proposal that outlines the variables included in the final price. The cost proposal is submitted to Statistics Canada for review and acceptance. Part of the decision-making process is to review the quality evaluation assessment to determine whether the value of the data is commensurate with its potential costs.

The directive and policy outline the requirements for financial oversight of data acquisitions with a cost. Specifically, section 6.6 of the directive states that private sector acquisitions with a cost are to be provided to CSSD for financial review. However, as of March 2018, CSSD was no longer involved in carrying out financial reviews for acquisitions that fall under the Statistics Act. Instead, a new process was implemented in which the responsibility to conduct a financial review of costs now rests with the statistical program manager and senior management, with support from DSD and OPMIC. The agency's policy instruments have not been updated to reflect this change in process. They also do not prescribe any requirements for how program managers can demonstrate that they have considered whether costs are reasonable and represent value for money. Through interviews with program managers, the audit found that the financial review process in place includes exercising due diligence for expenditure initiation under section 32 of the Financial Administration Act.

The audit reviewed four acquisition files with an associated cost that were obtained under the Statistics Act to determine 1) whether a cost proposal was provided and 2) whether the cost proposal was reviewed to validate the costs for reasonableness. In all four files examined, the data provider submitted a cost proposal. In three of the four files, there was email documentation demonstrating that management was aware of and accepted the costs being proposed by the data provider. However, none of the four files contained any analysis to demonstrate how management determined that costs were reasonable, in order to ensure value for money and that the agency's financial interests were protected.

Quality reviews of data from alternative sources are generally not being carried out.

Statistics Canada policy instruments are clear on the importance of, and responsibilities surrounding, the quality assessment of new data from alternative sources. As stated in section 6.4 of the directive, senior managers of statistical programs are responsible for assessing the quality of potential administrative data and their statistical usability, using the data quality evaluation framework maintained by DSD or any other suitable tool. Contained within the data quality evaluation framework is an evaluation questionnaire tool that is available for managers to assess new sources of data. The quality review process is an important step because it helps the agency determine the value of the data, which in turn helps to rationalize potential costs.

The audit reviewed the 12 acquisition files to determine whether a quality assessment was completed prior to obtaining the data. Overall, the audit found that program managers were generally not using DSD's data quality evaluation questionnaire or any other suitable framework. Four of the 12 files had documentation to demonstrate that a quality review took place; of these four, three were administrative data and one was publicly available data. The remaining eight files did not contain documentation to demonstrate that a quality review was completed prior to signing the data acquisition agreement. Through consultation with DSD, we were informed that statistical program managers were not using the standard evaluation questionnaire because of a perceived time burden. The audit also found cases in which the individuals responsible for the quality review were no longer in their roles, resulting in documentation not being available. Without consistently evaluating the quality of data from alternative sources, there is a risk that statistical programs may acquire data that are incomplete or incompatible with their programming needs.

DSD has since taken steps to address this by updating the quality evaluation questionnaire to make it more streamlined. In interviews, DSD stated that the updated quality evaluation framework, developed by the Modern Statistical Methods and Data Science Branch, will be completed by the end of fiscal year 2019-2020, with the intent of having it available for use in early 2020.

Recommendation 4

It is recommended that the Assistant Chief Statistician, Analytical Studies, Methodology and Statistical Infrastructure ensure that the data quality evaluation questionnaire is updated and that agency employees are aware of the requirement to assess data for quality and statistical usability prior to acquiring data from alternative sources.

Management response

Management agrees with the recommendation.

  • 4.1 Employees will be informed regarding their roles and responsibilities associated with the quality assessment of data from alternative sources. Accountability regarding the documentation and registration of the quality assessment process will be embedded in the updated data governance process for the acquisition of data from alternative sources.

Deliverables and timeline

The Director of the Data Stewardship Division, the Director of the Office of the Chief Editor and the Director of the International Cooperation and Methodology Innovation Centre will:

  • Develop a communication plan by May 2020 to inform managers regarding:
    • specific requirements associated with the quality assessment of data sources;
    • the quality assessment tools available; and,
    • the corporate tools available to register the quality assessment evaluation document.
  • Communicate activities defined in the communication plan by March 2021.

Appendices

Appendix A: Audit criteria

The criteria below present the control objectives / core controls / criteria, their sub-criteria, and the related policy instruments and sources.

Audit objective: Provide reasonable assurance to the Chief Statistician and the Departmental Audit Committee that management has adequate processes and controls in place to support the strategic, transparent and timely acquisition of data from alternative sources while ensuring the sound stewardship of public assets and agency interests.

  • 1. An effective governance framework is in place over the acquisition of data from alternative sources to support agency objectives.
    • 1.1 Oversight bodies are in place and have clearly communicated mandates that include roles with respect to risk management and control.
    • 1.2 Management requests and receives sufficient, complete, timely and accurate information from oversight bodies to inform risk-based decision making in relation to acquiring data from alternative sources.
    • 1.3 Roles, responsibilities and accountabilities, as defined in key policy instruments, are clearly understood.
    • Policy instruments/Sources:
      • Audit Criteria related to Management Accountability Framework (MAF): A tool for Internal Auditors
      • Statistics Canada's Quality Assurance Framework
  • 2. Employees are provided with the necessary processes, tools and training to support the acquisition of data from alternative sources.
    • 2.1 A comprehensive training plan is in place to support agency officials related to their work in negotiating data acquisition agreements.
    • 2.2 The training plan is being provided to employees across the agency.
    • 2.3 The training plan incorporates feedback and lessons learned from agency officials, in order to address specific challenges and barriers in negotiating data acquisition agreements.
    • 2.4 Employees have access to and are aware of sufficient processes and tools, such as software, equipment and standard operating procedures, to support their work in negotiating data acquisition agreements.
  • 3. Processes and controls are in place and working effectively to ensure that the agency's interests are protected when acquiring data from alternative sources.
    • 3.1 Acquisitions of data from alternative sources are formally documented.
    • 3.2 There is appropriate legal oversight of data acquisitions to ensure adequate protection to the agency.
    • 3.3 There is appropriate financial oversight of data acquisitions to ensure adequate protection to the agency.
    • 3.4 The quality of data from alternative sources is assessed prior to acquisition.

Appendix B: Initialisms

  • CS: Chief Statistician
  • CSSD: Corporate Support Services Division
  • DAC: Data Acquisition Committee
  • DSD: Data Stewardship Division
  • OPMIC: Office of Privacy Management and Information Coordination
  • SCDS: Statistics Canada Data Strategy

Manufacturing and Wholesale Trade (Monthly) - December 2018 to December 2019: National Level CVs by Characteristic

National Level CVs by Characteristic (all figures in %)
Month | Sales of goods manufactured | Raw materials and components inventories | Goods / work in process inventories | Finished goods manufactured inventories | Unfilled Orders
December 2018 | 0.59 | 0.94 | 1.23 | 1.34 | 1.13
January 2019 | 0.60 | 0.94 | 1.21 | 1.29 | 1.26
February 2019 | 0.62 | 0.93 | 1.22 | 1.26 | 1.13
March 2019 | 0.59 | 0.94 | 1.22 | 1.32 | 1.11
April 2019 | 0.60 | 0.96 | 1.20 | 1.33 | 1.16
May 2019 | 0.61 | 0.94 | 1.20 | 1.34 | 1.09
June 2019 | 0.58 | 0.94 | 1.18 | 1.38 | 1.15
July 2019 | 0.64 | 0.92 | 1.12 | 1.33 | 1.12
August 2019 | 0.61 | 0.92 | 1.18 | 1.34 | 1.11
September 2019 | 0.60 | 0.92 | 1.16 | 1.38 | 1.07
October 2019 | 0.60 | 0.93 | 1.18 | 1.39 | 1.14
November 2019 | 0.59 | 0.96 | 1.20 | 1.37 | 1.15
December 2019 | 0.57 | 0.99 | 1.29 | 1.38 | 1.10
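
For context, a definition not stated in the table itself: the coefficient of variation (CV) is the standard measure of sampling variability, namely the estimated standard error of an estimate expressed as a percentage of that estimate,

    \mathrm{CV}(\hat{Y}) = \frac{\widehat{\mathrm{SE}}(\hat{Y})}{\hat{Y}} \times 100\%

Read this way, the CV of 0.59% for sales of goods manufactured in December 2018 means the estimated standard error is roughly 0.6% of the published estimate; lower CVs indicate more precise estimates.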

National Weighted Rates by Source and Characteristic, December 2019

Characteristic | Response or edited (%) | Imputed (%)
Sales of goods manufactured | 91.0 | 9.0
Raw materials and components | 84.5 | 15.5
Goods / work in process | 86.4 | 13.6
Finished goods manufactured | 83.1 | 16.9
Unfilled Orders | 92.2 | 7.8
Capacity utilization rates | 78.3 | 21.7
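
As a hedged sketch of how such rates are typically read (the exact weighting scheme is not described on this page), the two columns can be interpreted as the shares of each weighted estimate that come from reported or edited values versus imputed values, so that for each characteristic

    \text{imputed rate} = \frac{\sum_{i \in \text{imputed}} w_i\, y_i}{\sum_{i} w_i\, y_i} \times 100\%, \qquad \text{response or edited rate} = 100\% - \text{imputed rate}

where the w_i are survey weights and the y_i are the reported or imputed values. Consistent with this reading, each row above sums to 100%.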

Quarterly Survey of Securitized Receivables and Asset-Backed Securities (F15)

Reporting entity

1. Indicate which type of corporation this report covers.

  1. A single corporation
  2. Part of a corporation
  3. A consolidated family of corporations
  4. Other (specify)

2. Is the reporting entity part of a Canadian consolidation?

  1. Yes
  2. No

3. Does this reporting entity have investments in partnerships or joint ventures?

  1. Yes
  2. No

4. Indicate the accounting standard used to complete this questionnaire.

  1. International Financial Reporting Standards (IFRS)
  2. Accounting Standards for Private Enterprises (ASPE)
  3. United States Generally Accepted Accounting Principles (U.S. GAAP)
  4. Other (specify)

5. Indicate the currency used to complete this survey.

  1. Canadian dollars
  2. U.S. dollars

6. What are the start and end dates of this enterprise's reporting period for the quarter ending:

From: YYYY-MM-DD to YYYY-MM-DD

Assets

7. Report your assets.

  a. Cash and deposits – Canadian currency
  b. Cash and deposits – foreign currency
  c. Accounts receivable
  d. Allowance for credit losses on receivables
  e. Canadian investments in non-affiliates ─ debt securities issued by the Government of Canada
    • e.1 Term-to-maturity of less than one year
    • e.2 Term-to-maturity of one year or more
  f. Canadian investments in non-affiliates ─ debt securities issued by provincial and municipal governments
    • f.1 Term-to-maturity of less than one year
    • f.2 Term-to-maturity of one year or more
  g. Canadian investments in non-affiliates ─ debt securities issued by corporations, trusts or others
    • g.1 Term-to-maturity of less than one year
    • g.2 Term-to-maturity of one year or more
  h. Canadian investments in non-affiliates ─ corporate shares, fund or trust units and other equity
    • h.1 Publicly traded
    • h.2 Other equity
  i. Canadian investments in non-affiliates ─ other investments
  j. Foreign investments in non-affiliates ─ debt securities
    • j.1 Term-to-maturity of less than one year
    • j.2 Term-to-maturity of one year or more
  k. Foreign investments in non-affiliates ─ other investments
  l. Derivative assets
  m. Reverse repurchase agreements
  n. Mortgage loans to non-affiliates ─ secured by property in Canada
    • n.1 Residential ─ NHA insured
    • n.2 Residential ─ non-NHA insured
    • n.3 Non-residential
  o. Mortgage loans to non-affiliates ─ secured by property outside Canada
  p. Mortgage loans to non-affiliates ─ accumulated allowance for credit losses
  q. Non-mortgage loans to non-affiliates
    • q.1 To individuals and unincorporated businesses ─ credit cards
    • q.2 To individuals and unincorporated businesses ─ lines of credit
    • q.3 To individuals and unincorporated businesses ─ other loans
    • q.4 To corporations
    • q.5 To others
  r. Non-mortgage loans to non-affiliates ─ accumulated allowance for credit losses
  s. All other assets
    Specify all major items within other assets
  t. Other allowances for credit losses
    Total assets

Liabilities and equity

8. Report your liabilities.

  a. Accounts payable
  b. Amounts owing to affiliates
    • b.1 In Canada
    • b.2 Outside Canada
  c. Borrowing from non-affiliates ─ mortgage loans
    • c.1 Residential
    • c.2 Non-residential
  d. Borrowing from non-affiliates ─ non-mortgage loans and overdrafts
    • d.1 From lenders in Canada ─ banks
    • d.2 From lenders in Canada ─ credit unions
    • d.3 From lenders in Canada ─ other lenders in Canada
    • d.4 From lenders outside Canada
  e. Borrowing from non-affiliates ─ asset-backed securities
    • e.1 Term-to-maturity of less than one year
    • e.2 Term-to-maturity of one year or more
  f. Borrowing from non-affiliates ─ subordinated debt
  g. Borrowing from non-affiliates ─ other borrowings
  h. Derivative liabilities
  i. Obligations related to repurchase agreements
  j. Accrued pension liability
  k. Non-pension post retirement benefits
  l. All other liabilities
    Specify all major items within other liabilities
    Total liabilities

9. Report your equity.

  a. Share capital
    • a.1 Preferred
    • a.2 Common
  b. Accumulated other comprehensive income
  c. Retained earnings
    • c.1 Opening balance
    • c.2 Net income (loss) for the current period
    • c.3 All other additions (deductions)
      Specify all major items within other additions (deductions)
    • c.4 Reinvestment of income in additional trust equity units
  d. Dividends declared
    • d.1 Cash ─ preferred shares
    • d.2 Cash ─ common shares
    • d.3 Other dividends
      Closing balance
      Total equity
  e. Total liabilities and total equity

Income statement 

10. What period does this income statement cover?

From: YYYY-MM-DD to YYYY-MM-DD

11. Report your revenue.

  a. Interest revenue from Canadian sources
    • a.1 Debt securities
    • a.2 Mortgages
    • a.3 Consumer loans
    • a.4 Other interest revenue
  b. Interest revenue from foreign sources
  c. Dividends
    • c.1 From Canadian corporations
    • c.2 From foreign corporations
  d. Gains and losses ─ fair value adjustments
    • d.1 Realized
    • d.2 Unrealized
  e. Gains and losses ─ foreign exchange
    • e.1 Realized
    • e.2 Unrealized
  f. All other revenues
    Specify all major items within other revenues
    Total revenue

12. Report your expenses.

  a. Depreciation and amortization
    • a.1 Depreciation
    • a.2 Amortization ─ intangible assets
    • a.3 Amortization ─ other
  b. Software and research development
  c. Interest expense
    • c.1 Asset-backed securities ─ debt securities with term-to-maturity of less than one year
    • c.2 Asset-backed securities ─ debt securities with term-to-maturity of one year or more
    • c.3 Subordinated debt
    • c.4 Other interest expense
  d. All other expenses
    Specify all major items within other expenses
    Total expenses

13. Report your income. 

  a. Net income (loss)
    • a.1 Attributable to non-controlling interest
    • a.2 Attributable to equity shareholders
  b. Other comprehensive income
    • b.1 Items that will not be reclassified to net earnings
    • b.2 Items that may be reclassified subsequently to net earnings
    • b.3 Reclassification of realized (gains) losses to net earnings
    • b.4 Income taxes
  c. Comprehensive income
    • c.1 Attributable to non-controlling interest
    • c.2 Attributable to equity shareholders

Disclosure of selected accounts

14. Report other disclosures.

  a. Equity method dividends
    • a.1 Canadian dividends
    • a.2 Foreign dividends
  b. Capitalized expenses for software, research and development

15. Allocate the changes to selected assets and liabilities.

  a. Canadian and foreign investments in non-affiliates ─ debt securities
    • a.1 Initial balance
    • a.2 Net (purchases-sales or issuances-repayments and other changes)
    • a.3 Fair value adjustments and foreign exchange valuation adjustments
    • a.4 Other adjustments
      Closing balance
    • a.5 Realized gains and losses
  b. Canadian and foreign investments in non-affiliates ─ corporate shares, funds or trust units and other equity
    • b.1 Initial balance
    • b.2 Net (purchases-sales or issuances-repayments and other changes)
    • b.3 Fair value adjustments and foreign exchange valuation adjustments
    • b.4 Other adjustments
      Closing balance
    • b.5 Realized gains and losses
  c. Canadian and foreign investments in non-affiliates ─ other investments in non-affiliates
    • c.1 Initial balance
    • c.2 Net (purchases-sales or issuances-repayments and other changes)
    • c.3 Fair value adjustments and foreign exchange valuation adjustments
    • c.4 Other adjustments
      Closing balance
    • c.5 Realized gains and losses
  d. Mortgage loans to non-affiliates
    • d.1 Initial balance
    • d.2 Net (purchases-sales or issuances-repayments and other changes)
    • d.3 Fair value adjustments and foreign exchange valuation adjustments
    • d.4 Other adjustments
      Closing balance
    • d.5 Realized gains and losses
  e. Non-mortgage loans to non-affiliates
    • e.1 Initial balance
    • e.2 Net (purchases-sales or issuances-repayments and other changes)
    • e.3 Fair value adjustments and foreign exchange valuation adjustments
    • e.4 Other adjustments
      Closing balance
    • e.5 Realized gains and losses
  f. Other assets
    • f.1 Initial balance
    • f.2 Net (purchases-sales or issuances-repayments and other changes)
    • f.3 Fair value adjustments and foreign exchange valuation adjustments
    • f.4 Other adjustments
      Closing balance
    • f.5 Realized gains and losses
  g. Asset-backed securities
    • g.1 Initial balance
    • g.2 Net (purchases-sales or issuances-repayments and other changes)
    • g.3 Fair value adjustments and foreign exchange valuation adjustments
    • g.4 Other adjustments
      Closing balance
    • g.5 Realized gains and losses
  h. Other liabilities
    • h.1 Initial balance
    • h.2 Net (purchases-sales or issuances-repayments and other changes)
    • h.3 Fair value adjustments and foreign exchange valuation adjustments
    • h.4 Other adjustments
      Closing balance
    • h.5 Realized gains and losses
  i. Derivatives (assets and liabilities)
    • i.1 Initial balance
    • i.2 Net (purchases-sales or issuances-repayments and other changes)
    • i.3 Fair value adjustments and foreign exchange valuation adjustments
    • i.4 Other adjustments
      Closing balance
    • i.5 Realized gains and losses

Video - Exploring the Attribute Table and Layer Properties Box of Vector Data

Catalogue number: 89200005

Issue number: 2020005

Release date: February 17, 2020

QGIS Demo 5

Exploring the Attribute Table and Layer Properties Box of Vector Data - Video transcript

(The Statistics Canada symbol and Canada wordmark appear on screen with the title: "Demo 5 - Exploring the Attribute Table and Layer Properties Box of Vector Data")

Following up from interacting with datasets in the Map Canvas, today we'll explore additional information and parameters found in the Attribute Table and Layer Properties Box. The Attribute Table contains additional variables for analyzing and visualizing vector data, while the Layer Properties box contains tabs that summarize information and provide additional functions. We'll quickly summarize some of the key tabs, their content and use, which we'll cover in detail in later demos.

So to open the Attribute Table of a layer, we can left-click it in the Layers panel and select the Attribute table icon, or right-click the layer and select Open Attribute Table.

So within the table, each column reports an additional variable tied to the vector dataset. These are referred to as fields within GIS, whereas each row corresponds to a specific feature or geometry within the canvas.

Using the row numbers on the left-hand side, we can select features. With an individual feature selected, we can right-click and Zoom to the Feature, and if we still can't see it, we can also flash the feature. In this case we can't see our feature because it's hidden by our Census Subdivision layer.

Like the Interactive Selection tools, we can use Shift and Ctrl to select multiple features: Shift selects features within a range, Ctrl adds individual features, and used together they select features both within and between ranges. We could also then zoom to our selection. So as you can see, when features are selected in the Attribute Table they are also highlighted in the Canvas and vice-versa – highlighted in yellow in the Canvas and blue in the Attribute Table.

To sort a field in ascending or descending order, we can left-click once or twice on the field name as needed. This can help when selecting features by a specific criterion of interest, such as all features within a particular province in this case. We can then zoom once more and, using Invert feature selection, switch the selection of features.
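
These selection steps can also be scripted from the QGIS Python console. Below is a minimal PyQGIS sketch, assuming a loaded layer named "Census Divisions" with a province-name field called "PRNAME" (both names are assumptions to adapt to your own data):

```python
# Minimal PyQGIS sketch: select by an expression, zoom to the selection,
# then invert it. Layer and field names are assumptions.
from qgis.core import QgsProject
from qgis.utils import iface

layer = QgsProject.instance().mapLayersByName("Census Divisions")[0]

# Select every feature whose province name is Manitoba.
layer.selectByExpression('"PRNAME" = \'Manitoba\'')

# Zoom the map canvas to the selected features.
iface.mapCanvas().zoomToSelected(layer)

# Switch the selection, like the Invert feature selection button.
layer.invertSelection()
```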

To move selections to the top of the attribute table, we can click the Move Selection to Top icon. So now if we add any additional features to our selection they are by default loaded at the top of the table. We could also copy our information and paste it into an external spreadsheet editor for further analysis.

Expanding the Show All Features dropdown, we could apply a field filter, selecting the field to filter by and specific criteria to use in filtering the table. Subsequently, the only remaining entries are those that satisfy the entered criteria, in this case Province name being Manitoba.

If we want a dynamic representation of our attribute features based on the scale and extent, we can apply the Show Features Visible on Map filter. Now if we change the scale or the location, our table is filtered accordingly.

To access additional tools, we can toggle on the editor. This enables us to add or delete features, as well as add and delete fields. We can also click on an individual cell's content to edit its information, or, for a selection of features, use the Update Field bar, specifying the field to update and the new attribute value, then clicking Update Selected. If we wanted to retain these changes we could save them, but in this case, since we want to keep our attribute table uniform, we'll just discard the changes and clear our selection.
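
The same edit-and-discard workflow can be sketched in PyQGIS; the layer name, field name and value below are assumptions for illustration only:

```python
# Minimal PyQGIS sketch of editing selected features and discarding the changes.
# Layer name, field name and value are assumptions.
from qgis.core import QgsProject

layer = QgsProject.instance().mapLayersByName("Census Divisions")[0]
layer.startEditing()  # equivalent to toggling on the editor

# Update one field for every currently selected feature, like the Update Field bar.
field_index = layer.fields().indexOf("NOTE")  # hypothetical field name
for feature in layer.selectedFeatures():
    layer.changeAttributeValue(feature.id(), field_index, "reviewed")

# Keep the attribute table uniform: discard the edits and clear the selection.
layer.rollBack()           # use layer.commitChanges() instead to save the edits
layer.removeSelection()
```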

To open the Layer Properties box of a layer we can right-click it and select Properties or simply double-left click within the Layers Panel.

The Layer Properties box contains various tabs which both summarize information and provide additional functions.

The Information tab summarizes the spatial characteristics as well as some of the attribute information within a dataset.

In the Source tab we can rename a layer, as we did with the Census Subdivisions. We can also use the Query Builder to filter features. However, this filters the layer's geometries in the Canvas, rather than just the table rows as the Field Filter did earlier.
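
The scripted counterpart of the Query Builder is a provider filter on the layer. A minimal sketch, assuming a layer named "Census Subdivisions" with a "PRNAME" field (both assumptions):

```python
# Minimal PyQGIS sketch: apply a provider filter (the Query Builder equivalent),
# which limits the geometries the layer exposes, not just the table rows.
from qgis.core import QgsProject

layer = QgsProject.instance().mapLayersByName("Census Subdivisions")[0]
layer.setSubsetString('"PRNAME" = \'Manitoba\'')  # hypothetical field and value

# Pass an empty string to clear the filter again:
# layer.setSubsetString('')
```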

The following four tabs are for visualization. We'll explore the Symbology and Labels tab in an upcoming demo, where we can apply different symbology styles to visualize fields within the attribute table, as well as differing labelling schemes. We can create Diagrams with the attribute information and, when enabled, also apply 3D visualizations.

The Source Fields tab provides more information on the field names, types and additional parameters, and with the editor enabled we can add, delete or rename a field.

So the Joins tab enables you to link datasets together, whether tables or vectors, by a field with common entries. The tab specifically works for one-to-one joins. So for example, here we could join the Census Division and Subdivision layers using the unique Census Division identifier field. If we want to remove our join, simply select it and click the minus icon.
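
A one-to-one join can also be defined in PyQGIS. The sketch below assumes the two census layers are loaded and share a Census Division identifier field called "CDUID" (layer and field names are assumptions):

```python
# Minimal PyQGIS sketch of a one-to-one join by a shared field.
from qgis.core import QgsProject, QgsVectorLayerJoinInfo

csd = QgsProject.instance().mapLayersByName("Census Subdivisions")[0]
cd = QgsProject.instance().mapLayersByName("Census Divisions")[0]

join = QgsVectorLayerJoinInfo()
join.setJoinLayer(cd)             # layer providing the extra attributes
join.setJoinFieldName("CDUID")    # field in the join layer (assumed name)
join.setTargetFieldName("CDUID")  # matching field in the target layer
join.setUsingMemoryCache(True)
csd.addJoin(join)

# Remove the join again, like clicking the minus icon:
# csd.removeJoin(cd.id())
```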

The final tab I'd like to cover is the Rendering tab, where we can apply a scale-dependent visibility, defining the minimum and maximum scale at which a dataset should begin or suspend rendering. We can set the scale from the drop-downs, or set it to the current map canvas scale by clicking on this icon. This is helpful for large or highly detailed datasets that take a long time to render. Now, clicking OK, if we zoom in our layer remains visible, but zooming out beyond the specified scale, you can see rendering is suspended.
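
Scale-dependent visibility can also be set programmatically. A minimal sketch, assuming a loaded layer named "Road Segments" and illustrative scale values:

```python
# Minimal PyQGIS sketch: draw the layer only between two scale denominators.
# Layer name and scale values are assumptions.
from qgis.core import QgsProject

layer = QgsProject.instance().mapLayersByName("Road Segments")[0]
layer.setScaleBasedVisibility(True)
layer.setMinimumScale(1000000)  # most zoomed-out scale at which the layer still draws
layer.setMaximumScale(1000)     # most zoomed-in scale
layer.triggerRepaint()
```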

Congratulations everyone! Today you've learned key skills in exploring, selecting and filtering features within the attribute table, performing simple edits and the use of some tabs within the Layer Properties box. In the next demo, we'll cover procedures for creating vector datasets, which includes delineating features and populating their attributes.

(The words: "For comments or questions about this video, GIS tools or other Statistics Canada products or services, please contact us: statcan.sisagrequestssrsrequetesag.statcan@canada.ca" appear on screen.)

(Canada wordmark appears.)

Video - Interacting with data in the Map Canvas

Catalogue number: 89200005

Issue number: 2020004

Release date: February 17, 2020

QGIS Demo 4

Interacting with data in the Map Canvas - Video transcript

(The Statistics Canada symbol and Canada wordmark appear on screen with the title: "Demo 4 - Interacting with data in the Map Canvas")

Now that we have learned to load and order our datasets in QGIS, let's explore some tools for interacting with them in the Map Canvas, particularly those found on the Map Navigation and Attribute toolbars. The skills covered today include changing and saving the extent, as well as identifying and selecting features from layers in the Map Canvas.

So picking up where we left off…

The Map Navigation toolbar contains tools for changing the scale of the Canvas. By default the Pan Map tool is engaged. Simply left-click and drag the Canvas in the direction of interest.

The Zoom Tools operate similarly: left-click and drag across the area you'd like to zoom to. The size of the box that's drawn determines how much the scale changes, so if we draw a large box the change is negligible, whereas with a smaller box the change is much more substantial. Alternatively you can use the scroll wheel of your mouse, scrolling backward and forward to zoom out and in respectively.

If we want to return to the extent of all active layers in the Panel we can use the Zoom to Full tool – helpful when we can’t find a particular dataset or if we just want to return to the full extent.

The Zoom to Layer tool is useful when the extents of loaded datasets differ, and works on the selected layer in the Layers Panel. Applying it to the road segments layer, it zooms to Manitoba, the area for which we downloaded the dataset.

The Zoom Last and Next tools are effectively the Undo and Redo of changes in the Canvas, enabling us to scroll through our previous zooms.

If you are going to be focussing on one area quite a lot for analysis or visualization, you can add a Spatial Bookmark and give the bookmark a name. Then if we were to close the Panel and zoom to another area in the map canvas, we can reopen the Panel, select the bookmark and click the zoom icon to return to the saved extent.

Just before moving on to the Attribute Toolbar, let's discuss grouping layers. We can use the Shift and Control keys to create a selection of layers, then right-click and hit Group Selected. This has many applications, such as grouping thematically related layers, preparing mapping groups, or organizing datasets – for example, toggling off many layers at once. Within the group, individual layers can be toggled off and on as normal. We can also right-click to move a layer out of the group, or drag and drop as desired.
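
Grouping can also be done through the layer tree in PyQGIS. A minimal sketch, assuming two loaded layers named "Rivers" and "Lakes and Rivers" and a hypothetical group name:

```python
# Minimal PyQGIS sketch: move two layers into a new group and toggle the group off.
# Group and layer names are assumptions.
from qgis.core import QgsProject

project = QgsProject.instance()
root = project.layerTreeRoot()
group = root.addGroup("Hydrology")

for name in ("Rivers", "Lakes and Rivers"):
    layer = project.mapLayersByName(name)[0]
    node = root.findLayer(layer.id())
    group.addChildNode(node.clone())      # copy the layer node into the group...
    node.parent().removeChildNode(node)   # ...and remove the original node

group.setItemVisibilityChecked(False)  # toggle the whole group off at once
```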

Now on to the Attribute toolbar – which as the name suggests contains various tools for selecting, editing and examining the attributes of active layers in the Layers Panel. Today we’ll use the Interactive Selection and Identify tools, which default to the selected layer in the Layers Panel.

So with the Census Division layer selected, we can zoom in and left-click to select individual features. We can also drag across to select multiple features. Using Control we can add and remove individual features, or remove a selection of features. Alternatively we can use Shift to add many features to the selection. We can click the Deselect Icon on the toolbar to remove the selection.
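
Rectangle selection has a scripted equivalent as well. A minimal sketch, with a layer name and coordinates that are purely illustrative and must be in the layer's CRS:

```python
# Minimal PyQGIS sketch: select features intersecting a rectangle, then deselect.
# Layer name and coordinates are assumptions.
from qgis.core import QgsProject, QgsRectangle

layer = QgsProject.instance().mapLayersByName("Census Divisions")[0]
layer.selectByRect(QgsRectangle(-101.0, 49.0, -96.0, 53.0))

print(layer.selectedFeatureCount(), "features selected")
layer.removeSelection()  # same effect as the Deselect icon
```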

If we expand the drop-down there are alternative selection options:

Select by Polygon is helpful for selecting irregular shaped features. We can left-click to add individual vertices and right-click to complete the polygon.

There is also Select by Radius, where we can zoom in, left-click a point of interest and left-click again when satisfied with the radius. Alternatively, we can specify the radius value in the top-right corner.

The Identify tool operates in a similar fashion. We can click an individual feature, and as we can see the Identify Panel returns information on both the geometry and attributes of the identified feature. Similar to the Interactive Selection tools we can drag across to identify multiple features and use the Collapse and Expand All icons to rapidly examine their attributes. Re-enabling the Census Subdivision layer, we can right-click and select Identify All. Here we returned two division features and six census subdivision features.

The same options from the Interactive Selection tool are available in the Identify tool by expanding the drop-down icon in the top-centre of the Panel. Additionally we can change the Mode to alter which layers' features are returned by the tool. Changing from Current to Top-Down will identify from all active layers. So re-enabling our grouped layers and creating a small selection in Northern Ontario, we've identified a few features within the hydrological layer and ultimately returned features from three separate layers.

To remove the identified features click the Clear Results icon within the Identify Panel.

So that summarizes some of the basic tools for changing the extent and scale of the map canvas as well as interacting with vector datasets in the map canvas. In the next demo we will explore additional information found within the Attribute table and Layer Properties box of vector datasets.

(Canada wordmark appears.)

Video - Loading and Ordering Spatial Data in QGIS

Catalogue number: 89200005

Issue number: 2020003

Release date: February 17, 2020

QGIS Demo 3

Loading and Ordering Spatial Data in QGIS - Video transcript

(The Statistics Canada symbol and Canada wordmark appear on screen with the title: "Demo 3 - Loading and Ordering Spatial Data in QGIS")

Hello everyone! So now that we've downloaded QGIS and spatial data, today we'll learn how to load and order datasets of different geometry types in QGIS, and save the Project for later use. For the demonstration we'll use several datasets that we downloaded in the previous video, covering the main geometry types of vector data: points, lines & polygons.

So the first step is to open QGIS Desktop from a desktop shortcut or from the start-bar.

And the first thing we'll do is pin QGIS to the taskbar, since we will be using it frequently in subsequent training videos.

When you open QGIS for the first time it looks like this.

To load spatial data into QGIS, datasets are added from the Browser panel into the Layers panel, and are then visualized in the Map Canvas.

So the first thing we need to do is expand the folders to find where we downloaded our spatial datasets to in the previous video. So I'll expand the Home folder and the Documents folder to find the GeospatialData folder.

Since it's the first time we're locating this folder, we'll right-click and add it as a favourite, which adds it to our favourites drop-down at the top, which will help us load datasets more quickly and easily in the future.

To see the available layers just continue expanding the folders, and within the Intro Demo folder there are 4 shapefiles and 2 geodatabase files.

So loading datasets is quite simple: you can just double-left-click, or drag-and-drop from the Browser to the Layers panel.

These procedures can also be applied to geodatabase files; you just need to expand the folder to see the available layers first. For Grain Elevators there is only one layer, so just double-left-click, while for Transport Features there are many; for the purpose of the demo we'll use the Road Segments layer.

Finally we will load in our two census boundary files into the layers panel.

Don't worry if the colours of your files differ from those in this video. QGIS assigns a single random colour when vector datasets are loaded.

So within the Layers Panel, individual layers can be toggled off and back on again, as well as renamed. So here I'll just rename the Census Subdivisions file with a more intuitive name.
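
Loading and renaming can also be scripted. A minimal PyQGIS sketch, in which the file paths and layer names are assumptions that should point to your own GeospatialData folder:

```python
# Minimal PyQGIS sketch: load a shapefile and a geodatabase layer, then rename one.
# Paths and layer names are assumptions.
from qgis.core import QgsProject, QgsVectorLayer

# A shapefile loads directly from its .shp path.
csd = QgsVectorLayer(
    "C:/Users/me/Documents/GeospatialData/census_subdivisions.shp",
    "Census Subdivisions", "ogr")

# A geodatabase layer needs its layer name appended to the path.
roads = QgsVectorLayer(
    "C:/Users/me/Documents/GeospatialData/transport_features.gdb|layername=road_segment",
    "Road Segments", "ogr")

for layer in (csd, roads):
    if layer.isValid():
        QgsProject.instance().addMapLayer(layer)

# Rename a layer in the Layers Panel with a more intuitive name.
csd.setName("Census Subdivisions 2016")
```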

Despite having loaded the six layers into the Layers Panel, we can only see one within the Map Canvas. This is because the order within the Layers Panel affects the order that they are rendered in the Map Canvas.

So in general points are placed above lines, which themselves are placed above polygons. For vectors of the same geometry type it is important to think about their position in the landscape relative to one another – so do rivers flow over roads, or do roads tend to get built over rivers? Well, often roads are built over rivers, so we'll just switch their order in the Layers Panel. And similarly, since the Lakes and Rivers polygon is a land-cover feature, we'll place it above the census boundary files.
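
Reordering can also be done through the layer tree in PyQGIS, by cloning a layer node to its new position and removing the original. A minimal sketch with a hypothetical layer name:

```python
# Minimal PyQGIS sketch: move a layer to the top of the Layers Panel.
# Layer name is an assumption.
from qgis.core import QgsProject

root = QgsProject.instance().layerTreeRoot()
roads = QgsProject.instance().mapLayersByName("Road Segments")[0]

node = root.findLayer(roads.id())
root.insertChildNode(0, node.clone())  # index 0 is the top of the panel
node.parent().removeChildNode(node)    # remove the original node
```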

So now if we zoom in we can see that all of our layers are visible in the Map Canvas.

The final component of the video I'd like to discuss today is saving the project for later use. This will save the order of layers in the Layers Panel, any visualization styles such as labels or colours, as well as any joins – all procedures that we'll discuss in later demos. So navigate to the Project Toolbar and click on the Save icon. In general we want to store the project in the same location as the spatial data, and provide it with an intuitive filename, like Loading and Ordering Spatial Data.
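
Saving the project can also be done from code; the file path below is an assumption:

```python
# Minimal PyQGIS sketch: save the current project to a .qgz file.
# The path is an assumption.
from qgis.core import QgsProject

QgsProject.instance().write(
    "C:/Users/me/Documents/GeospatialData/LoadingAndOrderingSpatialData.qgz")
```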

So that concludes the procedures for loading datasets into QGIS from the Browser to the Layers Panel, which will work for most spatial data, and how to order them in the layers panel for their visualization in the Map Canvas. Additionally, we learned how to save our project and the specific properties that are retained. Stay tuned for the next demo, where we will explore some of the tools on the Map Navigation and Attribute toolbars for interacting with these datasets in the Map Canvas.

(Canada wordmark appears.)

Video - Downloading Spatial Datasets from Open Maps

Catalogue number: 89200005

Issue number: 2020002

Release date: February 17, 2020

QGIS Demo 2b

Downloading Spatial Datasets from Open Maps - Video transcript

(The Statistics Canada symbol and Canada wordmark appear on screen with the title: "Demo 2b - Downloading Spatial Datasets from Open Maps")

So now that we have a better understanding of spatial data, let's go to the Open Maps website and download some datasets to use in QGIS. Specifically we'll download three datasets; then, using the skills from this demo, you can isolate and download the remaining files shown at the end of the video.

Open Maps is the integrated federal archive for spatial data, making it a one-stop shop for downloading thematically diverse datasets – from broad to highly specific content – which should enable most processes or features of interest to be examined. It is important to note that not all datasets are inherently spatial, but most have traits that enable their integration and use in GIS.

To begin searching for datasets scroll down the main page, and click the Open Maps link.

This brings us to a page with a search-bar where we can search specific datasets. The first one we'll look for is the Transport Features released by Natural Resources Canada as part of their CanVEC catalogue.

So the search results appear as such with a hyperlinked title to the main dataset page, a description of its content, the organization associated with its release and the file formats for the different components of the dataset.

At the moment datasets must be downloaded individually.

At the top right of the page are Search Filters to help refine the results. We'll apply two to find our dataset right now. Scrolling down to the Format Filter we'll check the FGDB box since we are looking for a File Geodatabase. In the Organization filter we'll click on Natural Resources Canada.

So we can click on the title to bring us to the main page of the dataset. The Resource Type column indicates the different components of the dataset, such as web services, the classification guide or the main dataset itself – in this case provided in Shapefile or File Geodatabase format. The file formats of spatial datasets and supporting information vary between entries.

So to download the dataset we'll click the Access tab beside FGDB. This brings us to the main index page, which hosts the available datasets. All are listed as Canvec, followed by abbreviations for the scale and geographic location, and all are part of the Transport series. National datasets vary in resolution from 1:1,000,000 to 1:15,000,000, whereas provincially subset datasets vary in resolution from 1:50,000 to 1:250,000.

In general, you should use the dataset that matches your intended scale of analysis and visualization. So using the finest resolution data for a national examination of transport features or using the coarsest resolution for local assessment would both be inappropriate.

We'll download the 1:50,000 dataset for Manitoba.

Closing the index page, now I'd like to quickly show the classification guide. Classification guides contain information to help interpret and use a dataset. In the Catalogue drop-down we'll select Transport – the dataset we downloaded. It defaulted to 1:50,000, so we can just scroll down to a layer of interest. And within the table, we can expand a field we'd like more information on. So expanding the Road Class drop-down, it provides the numeric IDs within the Attribute table, as well as the corresponding class and a detailed description of each class. We'll use this guide in a later demo to help classify our road segments.

So now we can close up the page and hit back. The first thing we'll do is remove the filters we applied earlier so they don't impact our next search results.

The next dataset we'll look for is the Annual Crop Inventory, a thematic raster released by Agriculture and Agri-Food Canada. This is a great resource for local assessments of crop variations both spatially and over time within Canada.

So scrolling down we can see each entry associated with a particular year, and then further down there is one without a designated year. We'll click on this link.

Once on the main page we'll scroll past the web-mapping services until we reach the main components of the dataset. We'll download the Classifications Guide, to help us interpret the crop classes associated with the different numeric values.

If we wanted more information on the sources and methodology used to create the dataset, we can access the metadata guide, which in this case provides information on the remote sensing datasets and methodologies, as well as the resolution and some accuracy assessments.

Now to access the main dataset, we'll click on the Access tab beside GeoTIF.

As we can see, the entire time-series is listed on this one integrated index, which would make for quicker downloading of a time-series than were we to click on the individual links in the original search results. We'll select 2017, and once again download the subset dataset for Manitoba.

Now the final dataset we'll look for is the land-cover circa 2000 file. So look up land-cover within the Search Bar and hit search. With 271 returned records, we'll once again scroll down to the format filter, expand it, and in the expanded options we'll select shapefile.

Once again it's our first returned result so we'll click the hyperlinked title. On the main page we'll scroll-down and click on the shapefile dataset.

Once again this is a subset dataset, but less intuitive than the provincially subset datasets we just downloaded. However, it is in the National Topographic System, a common referencing system for federal spatial data. So just look up NTS Index Canada in a new tab, and click the Open Canada link to access the reference guide, which will help us isolate the files that correspond with our area of interest. There are a variety of formats we could use to find the files of interest, but for now we'll use the .pdf file and click on the Prairies. This is how the system appears. Each area is referenced by a large number and – zooming in – they are also subset by specific letters. So if we wanted to download the land-cover for Winnipeg we could download 62 G, H, I and J. Closing up the Reference Guide and returning to the main data index page for the dataset, scroll down to 62 and then select G, H, I and J. This system is also used for the Digital Elevation Model datasets in Table 1.

I'd also like to quickly discuss downloading the Statistics Canada Census Boundary files, which are accessed from Statistics Canada's website. So look up Statistics Canada Boundary Files in a web browser and click on the link. There are two styles of boundary files. There are the cartographic boundary files, which include shorelines, islands and other land components, and are best used for visualizing data. And there are the digital boundary files, in which there is one feature for each corresponding boundary; these are best suited for processing and analysis.

Now that we know of the two different styles, we can click on the 2016 link to access the most recent boundary files. So we can select the specific style and level of interest. We'll start by downloading the digital boundary file for Census Divisions, clicking on Continue and then selecting the hyperlinked text to access the dataset. Then we can access the Census Subdivision Cartographic file, selecting the corresponding level and style, clicking on continue at the bottom of the page and the zipped dataset link on the next page. So repeat these procedures as necessary to download the remaining boundary files and styles, as well as the Lakes and Rivers Polygon and Rivers line datasets shown in the table at the end of the video.

While we are here we will also download the table datasets listed in Table 1.

First we'll download the Population and Dwelling Highlight Tables. So click on Data, and look up Highlight. The first returned result is the compiled highlight tables from the 2016 Census. Listed are the various highlight tables; ours is on Page 2, and we'll click on the Population and Dwelling Count highlights. We'll then download the complete geographic level by clicking on the CSV/TAB hyperlink, from which we can download the Census Divisions, Subdivisions and Tracts.

The next table dataset we'll look for is the Farms Classified by Total Farm Capital table. Much like Open Maps, there are various filters we can apply to help isolate the dataset, so let's add an Agriculture filter and search Farm Capital. Scrolling down, it is around the eighth result; once found, click on the hyperlinked title. On the main page, the dataset is provided for Canada by default. We can click on the Add/Remove Data tab to change the geography levels as well as the reference period, and choose the variables we want to download.

So let's expand the Geography levels to show how to select different boundary levels. To download a complete geographic level we can use the boxes at the top: clicking the box furthest to the right downloads the finest resolution, at the subdivision level, but in this case we'll select the Census Agricultural Regions – the third box. We could also expand and select a specific area of interest within the drop-downs, clicking all or an individual feature. Re-enabling the selection, we'll just select all Agricultural Regions and toggle Canada off.

Then we can go over to our variables. In this case all the variables are enabled by default – which is not always the case, so it is good to verify and select the variables that are of interest to you. If a dataset has been collected over multiple census periods, you can specify the reference period to include in the table from the drop-downs here.

And the Customizable Layout tab lets you format the dataset according to your particular use of interest. For us we'll simply change the Geography from Columns to Rows.

Then we'll click Apply. Once the formatting has been applied to the table below we can select Download Options. We'll Download As Displayed to retain the formatting specified. If we were to Download the Entire Table it would remove those formatting specifications.

The final procedure I'd like to discuss is extracting our datasets to a common folder. In GIS it is best practice to store all your datasets in one common directory. So we'll expand the Documents folder and create a new folder called GeospatialData. Within this folder we can create additional subdirectories, organized by project or theme, to keep the files organized. Before hitting extract, we will copy the directory path so we can paste it when repeating the procedure with other downloaded datasets. Repeat with the remaining files.
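
The same extraction step can be done with a few lines of plain Python; the folder paths below are assumptions:

```python
# Minimal Python sketch: extract every downloaded zip into the common
# GeospatialData folder, one subdirectory per dataset. Paths are assumptions.
import zipfile
from pathlib import Path

downloads = Path.home() / "Downloads"
target = Path.home() / "Documents" / "GeospatialData"
target.mkdir(parents=True, exist_ok=True)

for archive in downloads.glob("*.zip"):
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(target / archive.stem)
```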

Congratulations! From today's demo you've learned foundational skills to navigate the Open Maps Platform, download and take full advantage of the diversity of spatial data it stores. Many of these skills can be extended to accessing datasets from other geospatial archives, such as those hosted by municipal and provincial governments. We also covered the process of extracting and storing datasets in a common directory on your computer, with subdirectories to help organize the different datasets. With this experience you should be able to isolate relevant data and file formats for your own work activities. In the following tutorial we'll cover the procedures for loading and ordering datasets in QGIS.

(Canada wordmark appears.)