BAI performance measurement framework

A Progress Report on the BAI PMF Pilot, May 2019

About this Report

In Budget 2016, the Government of Canada announced its commitment to work with stakeholders to develop a performance measurement framework for business accelerators and incubators (BAIs) in Canada. The first step toward increased collaboration among BAIs in Canada to create a national solution for data collection and performance reporting took place on February 10, 2017 in Toronto, Ontario. Leaders from 18 BAIs discussed opportunities and challenges of creating a national performance measurement framework.

A BAI Steering Committee consisting of a representative group of BAI leaders and policy makers was subsequently formed to continue an inclusive national discussion and provide leadership in crafting a national performance measurement solution that works for the BAI community and its partners in government. The Committee's overarching mandate has been to work in partnership with the Government of Canada to a) establish a performance measurement framework, and b) pilot a performance measurement platform for BAIs. The primary output of the Committee's work was a Performance Measurement Framework (PMF) launched in February 2018.

The pilot remains ongoing, with data collection for the year 2017 now complete. Beginning in January 2019, the project partners and the participant community began gathering feedback on the initial stages of the pilot process. Part I of this report provides a synthesis of feedback received from government partners and industry stakeholders, documents the progress achieved to date, and catalogues some ongoing challenges that will be addressed as the pilot progresses. The feedback and learnings from the pilot, in turn, have been consolidated and integrated into an updated version of the performance measurement framework – termed "PMF 2.0." As a next step, this new and refined version of the framework will be implemented for data collection in year 2 of the PMF pilot, alongside a national roll-out.

In addition to presenting an updated survey instrument with clear definitions for the key performance metrics, this document serves as an operating manual for the BAI performance measurement process. As such, part II of the report provides the necessary background for onboarding new BAI participants and government partners, including the initial rationale for establishing a national performance measurement framework and a simple logic model that guides the design of the PMF. The report describes the approach for collecting, analyzing and reporting the data, including the methodology that will be used by approved researchers to produce the descriptive statistics and econometric analyses that will illuminate the relationship between BAI programs and the economic performance of client firms. Finally, the report details the operations and administration of the performance measurement platform, including processes for obtaining consent to share information and protecting the confidentiality of data.

The report was authored by Anthony Williams, president and co-founder of the DEEP Centre, with input from the project partners and participants in the BAI PMF.


PART I. Mandate and Progress to Date

Public sector organizations have long recognized the need to develop and deploy performance measurement systems to ensure that they have timely, strategically focused, objective and evidence-based information on their performance, in order to produce better results and remain high-performance organizations. Nowhere is this arguably more important than when it comes to ensuring that public investments in innovation and economic development establish a robust foundation for developing the companies, jobs and industries of the future. With this goal in mind, the Department of Innovation, Science and Economic Development (ISED) has been working with a representative group of policy leaders and stakeholders to develop a performance measurement framework for business accelerators and incubators (BAIs) in Canada. This framework will enable companies to choose their best options for support, help BAIs to benchmark their performance and improve their programming, and assist governments at all levels to increase the effectiveness of public investments in this area.

To date, the work of the BAI community and its partners in government has resulted in the development of a standardized reporting framework that establishes consistent definitions for job creation, revenue generation, investment and other outcome-related metrics; a common performance measurement platform that streamlines the collection, analysis and reporting of data; and a pilot process that was launched in early 2018 that provided a representative group of BAIs with an opportunity to test and refine the framework before rolling it out on a national basis. Other deliverables to date include a set of agreements to govern the collection and reporting of client data and an agreed process and methodology for analyzing the economic impact of BAIs.

This report is intended to guide the BAI community and its partners in government as they proceed with the next phase of building a national performance measurement solution. It presents an updated Performance Measurement Framework (termed BAI PMF 2.0) which will form the basis of a national performance measurement solution, documents the progress achieved during the pilot process, and reflects the key decisions and design inputs of the pilot participants and public sector partners. Specifically, the report:

1.1 Rationale for a National PMF

What qualifies as success for start-up assistance organizations? And how should a national framework measure this success? PMF participants and public sector partners have broadly agreed that the essential measures of success for BAIs are linked to the growth and competitiveness of incubated/accelerated firms. If incubators and accelerators are successful in selecting and nurturing promising business ideas, incubated firms, on average, should enjoy higher survival rates, grow faster, employ more people and attract more capital than a comparable cohort of non-incubated firms.

Leading incubators and accelerators collect data to document these outcomes because the data tells a powerful story. They can use it to hone and improve their programming, to benchmark their performance, and to market their achievements to clients, funders and stakeholders.  More broadly, performance data can illuminate the important role BAIs play in nurturing growth-oriented technology firms—firms that will help generate the jobs and innovation to fuel Canada's economic prosperity. In fact, all concerned stakeholders—BAIs, their current and prospective clients, and their funding partners in government and the private sector—stand to benefit substantially from the ability to document this value creation in a credible and consistent way.

In designing the measurement framework to capture the economic benefits that BAIs create, however, it became clear that the various stakeholders – including diverse BAIs across Canada – have differing mandates, objectives and clientele, and therefore prioritize different outcomes and measures of these outcomes. And while the majority of BAIs participating in the pilot project already collect performance data, most measure their performance using a diverse and often inconsistent range of metrics, and with widely varying levels of success in obtaining data from their clients.

A broad consensus has now emerged that a national performance measurement framework for Canada offers a better way forward. There is growing agreement that a common subset of the metrics that BAIs track – while not exhaustive – provides a reasonable starting point for defining success. Moreover, BAI leaders understand that a national framework for performance measurement could streamline reporting requirements and generate a range of other key benefits for the BAIs, their clients and the start-up ecosystem as a whole.

The benefits envisioned include:

With these benefits in mind it is worth clarifying how the PMF will be used to inform policy and funding decisions. The purpose of a standardized national measurement framework is to generate consistent and reliable data about the economic impact of BAIs, for the benefit of BAIs, companies seeking BAI support, and governments that fund BAIs. Analysis performed using the data collected from BAI clients during the pilot period – including the production of descriptive statistics and econometric modelling using linked datasets by ISED, Statistics Canada and/or approved researchers – will not be used to evaluate the performance of individual BAIs. It will, however, be used to inform robust conclusions about the role BAI programs play in firm growth and how to most effectively support innovative growth-oriented firms in Canada. For policymakers in particular, the objective is to use the PMF to evaluate the overall effectiveness of national funding programs, identify policy gaps and frame responses that boost the performance of Canada's business support ecosystem.

With respect to future funding applications for individual BAIs, it is expected that BAIs will present their performance data in a manner consistent with the PMF and, when applicable, use the framework to report their performance against specific program funding they receive. In doing so, BAIs and their partners in government can achieve greater alignment on reporting requirements and eventually reach a point where BAIs can enter data points once for multiple audiences and purposes.

In the interest of enabling fair and effective funding and policy decisions, BAIs will need to work closely with governments and other funding partners to interpret the data collected through the PMF. What constitutes high performance for BAIs will always be subject to variations across regions (e.g., population densities, funding models, and proximity to complementary business support services), sectors and level of ecosystem maturity, among other things. It is incumbent upon all stakeholders to recognize that while performance benchmarks across ecosystems are useful, caution should be exercised to ensure that data is interpreted using a sophisticated and nuanced approach that takes context into account. For this reason, the pilot process was designed to enable a representative group of BAIs and policymakers to incrementally test, evaluate and refine the processes for data collection, analysis and reporting to ensure that the PMF informs fair and effective decision-making by all relevant stakeholders.

1.2 Designing the Performance Measurement Framework

The process for developing a performance measurement framework for BAIs in Canada has unfolded over three stages to date, with phase I of the pilot stage (stage 3) having been completed in March 2019, and a second phase of the pilot planned for the remainder of 2019. The key stages are detailed below.

Table 1: Activity Breakdown – BAI Engagement, Feasibility Study, Pilot and Rollout Phases
1. BAI Engagement & National Dialogue (Fall 2016 to March 2017)
  • Convened a national discussion on best practices in performance measurement.
  • Consulted BAIs on their willingness to develop a national performance measurement framework.
  • Hosted a national dialogue to share best practices on BAI performance measurement.
  • Enlisted a small, but representative group of BAIs to proceed with a feasibility study.
2. Steering Committee & Feasibility Study (April to Dec 2017)
  • Established a BAI Steering Committee and working groups.
  • Forged agreement on standardized metrics, measurement tools and platform.
  • Conducted a test run of data collection and reporting with Hockeystick platform.
  • Reported on PMF progress and learnings to date.
3. Pilot Program Phase I and II (March 2018 to March 2020)
  • Enlist a nationally representative group of BAIs to participate in the pilot.
  • Pilot the performance measurement framework and data collection process over two collection and reporting cycles (i.e., 2017 and 2018 BAI program cohort/entrant data).
    • March – Dec 2018: BAIs completed their organizational profiles and uploaded their 2017 program cohort/entrant data.
    • March 2019 – March 2020: BAIs with a quarterly reporting cadence will upload their 2018 Q1–4 program cohort/entrant data. BAIs that collect data on an annual cycle will upload their 2018 data in Q1–2 of 2019.
4. Pilot Phase I and II Evaluation and Reporting (February to April 2019 and February to April 2020)
  • Produce analysis and BAI performance report.
  • Identify opportunities, challenges and tips for managing the data collection and reporting process with expanded number of BAI participants.
  • Calibrate performance metrics and processes based on insights and lessons learned from the pilot.
  • Gather feedback on the suitability of the data sharing platform for subsequent phases.
5. National Rollout (April 2020 and beyond)
  • Make necessary adjustments to the PMF and platform.
  • Further encourage federal and provincial government programs to adopt the metrics defined in the PMF to assess BAI programs.
  • Formalize governance/stewardship.
  • Recruit additional BAIs to participate in the national rollout.
  • Continually monitor and make adjustments to the PMF to ensure it remains useful to its stakeholders.

1.3 Key Outputs from Phase I of the BAI PMF Pilot

Phase I of the pilot, conducted between March 2018 and March 2019, provided the BAI community and its partners in government with the opportunity to test a common performance solution by uploading an initial set of data to a shared data collection platform. The objective of this exercise was to assess the appropriateness of the metrics list for a national performance measurement solution – including the clarity of questions, number of questions, and areas for analysis – and to refine the processes for data collection, sharing, and analysis.

Along the way, ISED and the participant group reflected on the broader purpose and objectives of a national performance measurement framework and problem-solved a variety of technical and operational issues that arose during the pilot. What follows is a brief overview of some of the key outputs from Phase I of the pilot.

Defining a measurement framework. Among the first tasks for the Steering Committee was defining a common set of performance metrics and complementary survey instruments for data collection. An updated version of the metrics framework is outlined in section 2.2 (performance metrics) and in appendix A and appendix B (the questionnaires for data collection). A consensus was reached that the performance metrics for phase I of the pilot would focus on a core set of financial indicators linked to the annual revenues, employment, capital raised and the intellectual property portfolio of client companies. A key challenge going into the pilot was arriving at common definitions for these indicators. While BAIs generally track the same outcomes (e.g., client revenues, employment and investment), they do so using different methods and differing indicators. Considerable time and effort went into crafting acceptable definitions for each metric: for example, how to define a job and how many hours constitute full-time employment; how to parse differing types of investment capital; and whether to track indicators using a calendar year or a fiscal year. Other challenges included defining an approach to collecting information about founder demographics and establishing a common industry/sector list to ensure that both BAI programs and client firms could be categorized the same way. The effort to reach consensus on these issues ensured that BAIs participating in the pilot tracked the same indicators using the same definitions and methods over the same time period. However, the pilot experience highlighted the need to further refine the framework to both streamline the questionnaire and clarify or modify the definitions for certain metrics.
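To make the definitional challenge concrete, a small sketch is shown below. It illustrates why a shared full-time-employment threshold matters when aggregating headcounts across BAIs; the 30-hour cutoff, the function name and the sample data are illustrative assumptions only, not the PMF's actual definitions (which are given in section 2.2 and the appendices).

```python
# Illustrative sketch only: converting a list of weekly hours per employee
# into a full-time-equivalent (FTE) headcount under an ASSUMED 30-hour
# full-time threshold. The PMF defines its own threshold; this is not it.
FULL_TIME_HOURS = 30  # assumed cutoff for illustration

def fte_count(weekly_hours):
    """Count full-time staff and convert part-time hours to FTEs."""
    full_time = sum(1 for h in weekly_hours if h >= FULL_TIME_HOURS)
    part_time_fte = sum(h / FULL_TIME_HOURS for h in weekly_hours
                        if h < FULL_TIME_HOURS)
    return full_time + round(part_time_fte, 1)

# Two full-time employees plus two half-time employees -> 3.0 FTEs
print(fte_count([40, 37.5, 15, 15]))
```

Under a different threshold (say, 37.5 hours) the same firm would report a different jobs number, which is precisely why the pilot invested effort in a single shared definition.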

Selecting and operationalizing the data sharing platform. With respect to data collection and storage, it was determined that the pilot data would be aggregated into a secure central platform that complements established platforms/processes used by mature BAIs. For BAIs, the data platform solution had to be low-cost, secure, convenient and useful. For BAIs with existing CRM solutions and data management systems, it was also important that participation in the pilot would not require BAIs to transition to a new platform. For the federal government partners, it was important that the platform provided a secure and trusted environment for hosting and visualizing data on servers located in Canada, as well as a secure data export function to provide approved analysts with convenient access to data. With these criteria in mind, ISED and the BAI Committee considered several available data platforms and, for a variety of reasons discussed further in part III, chose Hockeystick as the data sharing platform for the pilot. Hockeystick subsequently worked to customize its platform and interface for purposes of collecting, aggregating and exporting client data with the group of BAIs participating in the pilot.

Designing an approach to data collection and analysis. Having defined metrics and selected a platform, the BAI Steering Committee and its partners in government proceeded to work on designing the data collection process and determining how data collected during the pilot would be analyzed. The details of the data collection process are outlined in part II; however, it was determined that pilot data would be collected from BAI client companies for two full reporting cycles covering calendar years 2017 (pilot year one) and 2018 (pilot year two). With regard to data analysis, the Committee agreed on three fundamental points. First, that the purpose of the data analysis is to draw robust conclusions about the economic impact of BAIs, for the benefit of BAIs, companies seeking BAI support, and governments that fund BAIs. Second, that, in the interest of enabling longitudinal analysis, the data collected will be linked to Statistics Canada and other Government of Canada sources using client names and business numbers, following strict protocols to protect client confidentiality. Third, that data analysis should be performed by a reliable, committed party capable of consistent interpretation of the data. For the purpose of the pilot, it was determined that ISED will manage the data analysis and reporting process and work in partnership with Statistics Canada and approved researchers.

Recruiting participants and preparing for the pilot. With the platform selected and metrics defined, the BAI Steering Committee and its partners in government invited BAIs from across Canada to participate in a one-day forum in Toronto in February 2018 with the objective of educating BAI leaders about the pilot process and obtaining their input in order to make final adjustments to the metrics framework and the processes for collecting and analyzing performance data. Key challenges for the meeting included tweaking the metric definitions and survey instruments based on the input from the participants; working with Hockeystick to ensure the platform would meet the needs of BAI participants; reviewing precisely how and with whom client data will be shared; determining how and when to obtain client consent to aggregate data using Hockeystick and to share data with the federal government for the purpose of research; and communicating the rationale and objectives of the pilot to the broader BAI community, while enlisting a larger group of organizations to participate. Following a successful conclusion to the meeting, there was broad enthusiasm across the BAI community for participating in phase I of the pilot.

Collecting and uploading 2017 performance data. Following the February 2018 meeting, thirty BAIs formally agreed to participate in the pilot process. These organizations were first onboarded onto the platform (Hockeystick) and asked to complete their BAI profiles, which include information on the programs for which they subsequently submit client data. Only a portion of the BAIs that agreed to participate in the pilot – approximately two-thirds – were able to submit client data for the 2017 calendar year. This was due primarily to: a) having missed the 'window' to collect 2017 data as part of their annual collection processes; and b) not having the requisite consent agreements in place to share previously collected data. Notably, in an effort to reduce respondent burden, ISED and the project partners did not ask organizations to go back and collect new data for the 2017 period or obtain consent to share data that had already been collected from companies. Instead, BAI participants were asked to implement the PMF survey on a go-forward basis, starting in the next calendar year. Throughout the remainder of 2018, the Steering Committee co-chairs provided direct assistance and support to BAIs to encourage survey completion and problem-solve implementation challenges. The final deadline for submitting 2017 data was Thursday, November 15th.

Analyzing the performance data. In July of 2018, a meeting was held with Statistics Canada, ISED and a working group of BAI pilot members. The meeting provided an opportunity for BAIs to better understand the importance of data confidentiality at Statistics Canada, the agency's process for analyzing confidential micro-data (along with its process for approving external research projects using the data) and the measures it deploys to maintain the confidentiality of the data it holds about Canadian companies. Feedback from participants indicates that this meeting raised the level of confidence BAIs felt in contributing their client data to the pilot project and enabled BAIs to better reassure their client companies that the confidentiality of their data would be protected (see discussion in Section 2.3). By November 2018, 20 BAIs had submitted data for the first year of the pilot. This yielded 699 company-level records that included company identifiers that could be used by Statistics Canada to link the data with administrative datasets. Submissions that did not include the identifying information required to link the datasets within the secure environment at Statistics Canada were necessarily discarded. The micro-data was subsequently submitted directly to Statistics Canada for analysis by approved researchers. The preliminary findings from this analysis were presented at the BAI PMF Mini-Summit on February 20, 2019 in Waterloo. Highlights from the findings are presented in section 1.4.

Gathering feedback on phase I of the pilot. During the course of phase I of the pilot, ISED gathered feedback from both pilot participants and other federal partners to inform this release of the "2.0" version of the BAI PMF framework. The February 20th 2019 mini-summit in Waterloo also provided an opportunity for BAI participants to provide their feedback on the pilot experience, including the survey instrument, the process for collecting and analyzing performance data, the implementation support provided by ISED and Hockeystick, and the potential future direction of the performance measurement framework. Finally, a short survey was administered by Chris Diaper of TEC Edmonton to solicit input from BAIs that participated in phase I of the pilot. These various sources of feedback and input have been synthesized and are summarized in section 1.5.

Refining the measurement framework. The final step in phase I of the pilot was to implement changes to the measurement framework and data collection process to respond to issues either observed during the pilot or raised by BAIs in the subsequent feedback session. Most of the changes reflected in version 2.0 of the PMF were implemented to simplify and streamline the survey instrument, to lessen the data collection burden on BAIs and client companies and, in some instances, to clarify questions that had caused confusion or ambiguity. These changes are reflected in this document and will be operationalized for phase II of the pilot.
