Operating Support Program Evaluation (2011-12 – 2017-18)
Final Evaluation Report

June 2023

At the Canadian Institutes of Health Research (CIHR), we know that research has the power to change lives. As Canada's health research investment agency, we collaborate with partners and researchers to support the discoveries and innovations that improve our health and strengthen our health care system.

Canadian Institutes of Health Research
160 Elgin Street, 9th Floor
Address Locator 4809A
Ottawa, Ontario K1A 0W9

This publication was produced by the Canadian Institutes of Health Research. The views expressed herein do not necessarily reflect those of the Canadian Institutes of Health Research.

Acknowledgements

Special thanks to all who participated in this evaluation through end of grant reports and case studies, as well as to CIHR’s Research, Knowledge Translation and Ethics (RKTE) Portfolio (Program Design and Delivery and the Operations Support and Science Policy Branches), CIHR’s Financial Planning Unit, CIHR’s Results and Impact Unit, and Nathalie Kishchuk of Program Evaluation and Beyond.

The OSP Evaluation Team

Shevaun Corey, Angela Mackenzie, Kwadwo (Nana) Bosompra, Kimberly-Anne Ford, Jenny Larkin, Hayat El-Ghazal, Alexandra Leguerrier (student), Sabrina Jassemi (student), Michael Goodyer, and Ian Raskin.

For more information and/or to obtain copies, please contact evaluation@cihr-irsc.gc.ca.

List of Acronyms

ARC Average of Relative Citations
ARIF Average Relative Impact Factor
CAHS Canadian Academy of Health Sciences
CIHR Canadian Institutes of Health Research
DORA San Francisco Declaration on Research Assessment
DRF Departmental Results Framework
ECI Early Career Investigator
FGP Foundation Grant Program
FPA Funding Policy and Analytics
HQP Highly Qualified Personnel
IIR Investigator-Initiated Research
NPI Nominated Principal Investigator
OECD Organisation for Economic Co-operation and Development
OOGP Open Operating Grant Program
OSP Operating Support Program
PDD Program Design and Delivery
PDR Priority-Driven Research
PGP Project Grant Program
PIP Program Information Profile
PREP Peer Review Expert Panel
RIU Results and Impact Unit
RKTE Research, Knowledge Translation and Ethics
RRS Research Reporting System
SCIO Subcommittee on Implementation and Oversight

Executive Summary

Program Overview

Approximately three-quarters of the budget at the Canadian Institutes of Health Research (CIHR) is used to support investigator-initiated research (IIR): research projects developed by individual researchers and their teams on topics of their own choosing. At the time of the evaluation, funding for IIR was provided primarily through the Project Grant Program (PGP) and the Foundation Grant Program (FGP), which replaced the Open Operating Grant Program. All three programs made up the Operating Support Program (OSP). In addition to the OSP, funding is also provided through Tri-Council career and training programs (e.g., Canada Research Chairs, Banting Postdoctoral Fellowships, and Vanier Canada Graduate Scholarships). Beginning in 2010, CIHR started the process of reforming its investigator-initiated programs and related peer review processes. The reforms took place in a context of constrained funding, experienced implementation challenges, and were met with mixed reactions from the health research community. The FGP and PGP underwent many changes, and in April 2019 CIHR made the decision to sunset the Foundation Grant Program.

Evaluation Objective, Scope and Methodology

The objective of this evaluation was to provide CIHR senior management with valid, insightful, and actionable findings about the performance of the former Open Operating Grant Program (OOGP) as well as the relevance and design and delivery of the successor programs – the FGP and PGP. The evaluation covered the period from 2011-12 to 2017-18 and is the second evaluation of the OOGP. Evaluation findings were triangulated across a variety of data sources, including analyses of documents, administrative data, and end of grant reports, along with bibliometric analysis. The evaluation meets the requirements of the Treasury Board of Canada Secretariat (TBS) under the Policy on Results and the Financial Administration Act.

Given that CIHR’s OSP funding is currently provided only through the PGP, the recommendations focus on this program. It is important to note that the evaluation was completed in fiscal year 2019-20, with the approval and web posting of this report, as well as the development of the management action plan, delayed due to the COVID-19 pandemic. It should also be acknowledged that a number of important changes have taken place at CIHR since the completion of this report, most notably the implementation of CIHR’s 2021-2031 Strategic Plan, which has resulted in a number of key actions related to advancing research excellence, building health research capacity, and integrating evidence in health decisions.

Key Findings

Overall, the evaluation found that funding investigator-initiated research remains an effective means to support health research and build health research capacity. The following key findings relate to the relevance, performance, and design and delivery of the OSP.

The OSP addressed a continued need for investigator-initiated health research

Given the nature and extent of the investment in the OSP, CIHR is addressing the continued need for investigator-initiated health research. The evaluation found that CIHR investments in the OSP are aligned with Government of Canada priorities, as articulated in Canada’s Science Vision, the Fundamental Science Review, and the 2018 and 2019 federal budgets. Broadly, the OSP aligns with the CIHR Act, CIHR’s roles and responsibilities, and the strategic directions of Roadmap II (the strategic plan in place during the period under review), specifically promoting excellence, creativity, and breadth in research.

The OSP contributed to advancing knowledge creation and building health research capacity

The evaluation found that the OSP has been attracting and funding research excellence. Specifically, OOGP-funded researchers and FGP and PGP applicants are more productive and impactful than health researchers in Canada and other OECD countries. The evaluation also found that OOGP funding across pillars (although the majority of grants are biomedical) has successfully facilitated the creation, dissemination, and use of health-related knowledge (mainly within academia), and has contributed to building Canadian health research capacity by increasing the number of researchers and trainees indirectly supported by these grants.

OOGP-funded research demonstrated limited translation of knowledge beyond academia and limited longer-term health and socio-economic impacts

Despite program objectives, as well as the objectives and priorities of the CIHR Act and strategic plan (Roadmap II), the evaluation found that fewer than half of OOGP grants involved or had an impact on stakeholders beyond researchers and study stakeholders, as reported by NPIs through end of grant reports. Similarly, fewer than 15% of OOGP grants resulted in the translation of knowledge beyond academia, longer-term health impacts, or socio-economic impacts.

CIHR needs to better define and align the objectives of the PGP in relation to the CIHR Act given that the FGP has been sunset

The OSP has undergone many changes since the launch of the new programs under the reforms, with several elements not delivered as planned (e.g., reviewer matching software, College of Reviewers) and with noted implementation challenges (see the recommendations from the Internal Audit Consulting Engagement, the 2016 Working Meeting with the Minister of Health, and the PREP). Despite the broad alignment of the OSP with the CIHR Act, the evaluation shows that the current objectives of the PGP lack alignment with the Act, specifically regarding building Canadian health research capacity. Capacity building was a specific objective of both sunset programs (OOGP and FGP). Given that the PGP is the only remaining investigator-initiated program, a review of its objectives to ensure alignment with the Act is needed.

CIHR needs to improve monitoring and assessment of the outcomes and impacts of its investigator-initiated research

While there is a wealth of application, competition, and implementation data available for the OSP (e.g., surveys about the application and decision processes), there is currently a lack of output/outcome data being collected to assess progress toward expected outcomes beyond the end of grant report (which is only administered 18 months post grant expiry). Furthermore, the evaluation shows that there are concerns about the availability and reliability of the data from the current end of grant report (e.g., self-report; low response rates; variability in completion times; overall length, structure, and type of questions included), thereby limiting the ability to accurately assess whether OSP programs are effectively achieving their objectives. Although CIHR is making advances in data governance, challenges with data ownership and management (i.e., multiple units are responsible for the collection and dissemination of data) further affect the ability to monitor and assess program performance.

CIHR needs to ensure funding decisions are made equitably

The evaluation showed that there are differences in funding and outcome characteristics by pillar, gender, and career stage across the individual OSP programs (OOGP, FGP, PGP) that need to be considered in the design and delivery of the PGP going forward. Although the OSP funds researchers across pillars, the majority are from Pillar 1 (Biomedical). Male researchers have higher success rates than female researchers, and research shows sex and gender biases against women in funding decisions, specifically related to the OOGP and FGP. Early career researchers have lower success rates compared to mid- and senior-career researchers, and applications submitted in English are generally more successful than those submitted in French. Recently, CIHR has taken steps to address inequities, such as the equalization of success rates across career stages for the PGP. CIHR is committed to addressing any unconscious biases in its processes to ensure equitable access to research funding (e.g., CIHR’s Equity Strategy, the Tri-Agency Statement on Equity, Diversity and Inclusion, and the Tri-Agency Equity, Diversity and Inclusion Action Plan).

Recommendations

Given the programmatic shifts in the OSP, most notably the sunset of the FGP, the evaluation makes three recommendations aimed at improving the design, delivery, and performance of the Project Grant Program.

Recommendation 1

CIHR should revise the PGP objectives to ensure they are clearly defined, fully aligned with, and supportive of key aspects of the CIHR Act related to building Canadian health research capacity.

Recommendation 2

CIHR needs to ensure that investigator-initiated research funding is distributed as equitably as possible while minimizing the potential for peer review bias. The design and implementation of investigator-initiated grants must account for differences within the health research community observed by the evaluation (e.g., by pillar, sex, career stage, and language) as well as those documented in the broader research literature.

Recommendation 3

CIHR needs to improve the monitoring and assessment of activities and investments in investigator-initiated research.

  1. CIHR needs to enhance the collection of performance data related to capacity building (e.g., indirect support of trainees), knowledge translation beyond academia (i.e., informing decision making), collaborations, health impacts, and broad socio-economic impacts to better understand the full impact of grant funding.
  2. CIHR needs to revise the current end of grant reporting template and process in order to improve the availability, accuracy, and reliability of the data collected.
  3. CIHR should consider additional ways to collect data beyond end of grant reports, such as interim reporting and longer-term follow-up, to assess impact.

Program Profile

Context

As stated in the CIHR Act, the CIHR mandate is to “excel, according to internationally accepted standards of scientific excellence, in the creation of new knowledge and its translation into improved health for Canadians, more effective health services and products and a strengthened Canadian health care system.” CIHR is the major Government of Canada funder of research in the health sector and classifies its research across four “pillars” of health research: biomedical; clinical; health systems/services; and population health. CIHR invests approximately $1 billion in health research each year. This investment supports both investigator-initiated and priority-driven research.

CIHR classifies investigator-initiated research (IIR) as research where individual researchers and their teams develop proposals for health-related research on topics of their own choosing. Approximately three-quarters of CIHR's $1 billion budget is used to support IIR through its core programs (i.e., the OOGP, FGP, and PGP) as well as through Tri-Council career and training programs (e.g., Canada Research Chairs, Banting Postdoctoral Fellowships, and Vanier Canada Graduate Scholarships). The balance is spent on priority-driven research (PDR), which refers to research in areas identified as strategically important by the Government of Canada; in this case, themed calls for research proposals are made.

Evolution of Investigator Initiated Programming at CIHR

Until 2014, the Open Operating Grants Program (OOGP) was CIHR’s primary mechanism for supporting IIR. The specific objectives of the OOGP were to contribute to the creation, dissemination and use of health-related knowledge, and to help develop and maintain Canadian health research capacity. These objectives were pursued by providing support for original, high-quality projects or programs of research, proposed and conducted by individual researchers or groups of researchers, in all areas of health.

In 2009, CIHR’s Health Research Roadmap introduced a bold vision to reform the peer review and open funding programs. Beginning in 2010, CIHR started the process of reforming its investigator-initiated research programs, including the OOGP, and the related peer review processes (see Figure A: Timeline of CIHR Reforms Process, 2009-2017). These reforms were informed by three main lines of evidence:

  1. Data from an IPSOS Reid poll of the scientific community in 2010 conducted by CIHR that found strong support from the research community to fix a peer review system that was perceived as ‘lacking quality and consistency';
  2. A recommendation of CIHR's second International Review Panel in 2011 that ‘CIHR should consider awarding larger grants with longer terms for the leading investigators nationally. It should also consolidate grants committees to reduce their number and give them each a broader remit of scientific review, thereby limiting the load'; and,
  3. Findings from the 2012 evaluation of CIHR's Open Operating Grant Program (OOGP) which recognized challenges in open funding across pillars of research and supported the need to reduce peer review and applicant burden.

CIHR conducted several rounds of consultations with its stakeholder communities prior to and throughout the reforms process. The earliest consultations, which led to the proposed design, resulted in a number of challenges being identified with CIHR’s existing funding architecture and peer review processes. The resulting re-design was intended to address those challenges, some of which included funding program accessibility and complexity, applicant burden and “churn”, insufficient support for new/early career investigators, unreliability/inconsistency of reviews, and high peer reviewer workload.

CIHR moved to a new open suite of programs, which were piloted and implemented between 2010 and 2016. The majority of its IIR funding was now being awarded through its FGP and PGP. The objectives of the new open suite of programs are provided below.Footnote i However, the reforms, which took place in the context of constrained funding and experienced implementation challenges, were met with mixed reactions from the health research community. During this time, CIHR’s strategic plan was also updated (Health Research Roadmap II – 2014-15 to 2018-19) and continued the implementation of the reforms as part of the strategic direction focused on promoting excellence, creativity and breadth in health research and knowledge translation.

Monitoring and Review of the New Programs

Beginning in 2013, CIHR piloted specific design elements associated with the new programs, including structured applications, remote review, a new rating scale, and a streamlined CV. These pilots were conducted in a ‘live’ manner (i.e., piloted during routine program delivery across several programs) so that CIHR could monitor outcomes in an evidence-informed manner and safeguard the reliability, consistency, fairness, and efficiency of the competition and peer review processes. CIHR intended the implementation of the programs to be iterative, drawing on feedback from pilot studies with stakeholders and its own internal reviews. The results from the pilots were analyzed and consolidated into a series of reports; an overview of the key points is provided below.

In 2015, CIHR commissioned a number of reviews to assess CIHR's internal systems and implementation processes, to allow for adjustments in a timely manner, given the complexity of the pilot projects and new programs.

CIHR’s Internal Audit Unit reviewed the reforms implementation as part of an Internal Audit Consulting Engagement (2016), which focused on governance and administrative practices linked to project management and internal reorganization in order to deliver the new Foundation and Project Grant programs. The report noted that the reforms implementation project benefited from well-developed planning tools and that the pilots were rolled out on time. The report also indicated that there were opportunities for improvement in the areas of information-sharing, communications, reporting, project planning, and stakeholder engagement, all of which are being addressed by CIHR. The Reforms CRM Project Independent Third-Party Review by Interis Consulting was specifically designed to assess the implementation of the business systems required to support the new program delivery processes. CIHR sought expert advice to provide recommendations concerning the implementation of complex, transformative business systems. Results indicated that opportunities for improvement included clarifying roles and responsibilities as well as project schedules and scope.

CIHR accepted the recommendations of the reviews outlined in the CIHR Management Response, released in May 2016, has taken steps to implement the reports’ recommendations, and established new governance committees to monitor scope and timelines for the projects. These two reviews allowed management to receive feedback as the new managerial and business systems were being implemented, enabling course corrections. To date, CIHR has implemented most of the recommendations and continues to undertake the necessary steps to address project management challenges.

As of mid-2018, a total of four FGP competitions had been launched (in 2014, 2015, 2016, and 2017), the first two of which were labelled as “live pilots”. The Project Grant Program has also had four competition launches since 2016, the first of which was labelled a “live pilot”. The competitions were held in Spring 2016, Fall 2016, Fall 2017, and Spring 2018, with no competition launched in Spring 2017. Several enhancements were made to the 2015 FGP “live pilot” competition, informed by the 2014 “live pilot” and related survey data from reviewers, applicants, and Competition Chairs. These enhancements included clarifying adjudication and application criteria and guidelines, limit increases and additions to sections of the Foundation CV and Stage 2 application, changes in sub-criteria weighting, additional reviewer training, and the exploration of the benefits and operational requirements of introducing synchronous reviews.

The implementation of the reforms was met with mixed reactions by Canada's health research community, and on July 13th, 2016, at the request of the Minister of Health, CIHR hosted a Working Meeting with members of the community. The purpose of this meeting was to review and jointly address concerns raised regarding the peer review processes, particularly those associated with the Project Grant Program. The key outcomes of that meeting and requested changes to the 2016 Project Grant competition included the appropriate review of Indigenous applications; adjustments to the number of applications permitted and page limits; adjustments to Stage 1: Triage (e.g., number of reviews, elimination of asynchronous online discussion, elimination of the alpha scoring system) and Stage 2: Face-to-Face Discussion (e.g., inclusion of highly ranked applications and those with large scoring discrepancies, return to face-to-face panels for 40% of applications reviewed at Stage 1); as well as the establishment of a Peer Review Working Group.

The Peer Review Working Group, under the leadership of Dr. Paul Kubes, was established as an outcome of the July 13th, 2016, Working Meeting. The Working Group discussed each of the outcomes and made recommendations for action, including: revised eligibility and adjudication criteria; revised roles for Competition Chairs and Scientific Officers; the removal of asynchronous online discussion from Stage 1; reversion to a numeric scoring system; Stage 2 face-to-face reviews; reviewer training; and the launch of the pilot observer program in peer review for early career researchers.

In September 2016, CIHR announced an International Peer Review Expert Panel (PREP) to examine the design and adjudication processes of its IIR programs in relation to the CIHR mandate, the changing health sciences landscape, international funding agency practices, and the available literature on peer review. The review was in line with the mandated five-year cycle of international review of CIHR, but the timing was brought forward due in part to the stakeholder reaction to the implementation of the reforms. The Panel was supported by the Director General, Performance and Accountability Branch, and members of the OSP evaluation team, given the need for independence of the panel as well as the direct relevance of the panel’s work to the planned OSP evaluation. The Panel presented its report in February 2017, observing that although the basic design objectives and intent of the reforms were appropriate, sound, and evidence-based, there had been implementation failures. These included the failure to: effectively pilot the applicant-to-reviewer matching algorithm (which is no longer being used); have the College of Reviewers in place at the outset of the reforms; effectively engage the research community throughout the reforms; and maintain the trust and confidence of CIHR's main stakeholders, the research community and Canadians, as represented through politicians. The Panel noted that these implementation challenges, coupled with the series of rapid simultaneous changes that CIHR had made to some of its funding programs and Institutes, and the persistent constraint of flat-lined funding for investigator-initiated research in Canada, had resulted in a loss of trust from the CIHR research community and its stakeholders.

In July 2017, CIHR announced that the inclusion of early career investigators (ECIs) in the FGP was inconsistent with the vision of the program and that ECIs would no longer be eligible to apply for a Foundation Grant starting with the 2017-18 competition. At the same time, in response to feedback from its key constituents and in the context of available funding, CIHR announced a realignment of its funding strategy for the two programs, redirecting $75M from the Foundation Grant envelope of $200M to the Project Grant envelope. Then in November 2017, CIHR struck a Foundation Grant Program Review Committee, chaired by Dr. Terry Snutch of the University of British Columbia, to provide recommendations on the program’s objectives, design, application, and peer review processes; its recommendations were presented to CIHR’s Governing Council in November 2018. The committee recommended continuing to support the Foundation Grant Program with modifications: continuing to target mid- to senior-career researchers; adopting a single stage, application, and review process conducted face-to-face; allocating 25% of CIHR’s IIR budget to the FGP; making Foundation Grant Program applicants ineligible to apply to the Project Grant Program at the same time; and tracking data related to pillar, gender, and visible minorities to address any biases that may arise.

On April 15, 2019, CIHR announced that the Foundation Grant Program would be sunset and that the 2018-19 competition would be the last. The decision was based on a number of consultations (including with CIHR’s Scientific Directors, Science Council, and Governing Council), the input of the Foundation Grant Program Review Committee (struck in 2017, which recommended significant modifications to the program while preserving it), and a critical review of data, including preliminary findings from this evaluation. The review of the data highlighted unintended consequences in the funding distribution within the program (e.g., funding a disproportionate number of applicants who were older/more senior, from larger institutions, and conducting Pillar 1 research, as well as inequity for female applicants in Stage 1) that were deemed unacceptable. In addition, CIHR acknowledged that the peer review process did not align with the renewed commitment to face-to-face review and had not reduced reviewer burden as originally envisioned.

Since 2019, there have been a number of key changes at CIHR, notably the development and implementation of CIHR’s 2021-2031 Strategic Plan, which has resulted in a number of actions related to advancing research excellence, building health research capacity, and integrating evidence in health decisions. In addition, CIHR’s Equity Strategy outlines actions to foster equity, diversity and inclusion in the research system.

Program Objectives

The focus of the current evaluation is the Operating Support Program (OSP), which during the period under review represented CIHR investments in the OOGP and its successor programs, the FGP and PGP. The OSP is a sub-program of CIHR’s broader Investigator Initiated ProgramFootnote ii. The OSP aims to contribute to a sustainable Canadian health research enterprise by supporting world-class researchers in the conduct of research and its translation across the full spectrum of health.

Open Operating Grant Program

As indicated above, the OOGP was CIHR’s primary mechanism through which investigator-initiated health research was supported, from before the creation of CIHR in 2000 (when it was the Medical Research Council of Canada) up until 2014-15. The specific objectives of the OOGP were to:

  1. contribute to the creation, dissemination and use of health-related knowledge; and
  2. help develop and maintain Canadian health research capacity.

As a result of the reforms, two programs were created: the FGP, to provide long-term support for the pursuit of innovative, high-impact programs of research; and, the PGP, to support projects with a specific purpose and defined endpoint. The objectives of the OOGP were to be encompassed in the objectives of the new programs.

The total number of OOGP grants awarded in competition years between 2000-01 and 2015-16 was 13,331 across all four pillars (Pillar 1: 72%, Pillar 2: 12%, Pillar 3: 6%, Pillar 4: 8.5%). Almost three-quarters (72%) were awarded to male NPIs and just over one quarter (28%) to female NPIs.

Foundation Grant Program

The FGP (one competition per year) was designed to contribute to a sustainable foundation of health research leaders by providing long-term support for the pursuit of innovative, high-impact programs of research.

The program is expected to:

Project Grant Program

The PGP (two competitions per year) was designed to capture ideas with the greatest potential for important advances in health-related knowledge, the health care system, and/or health outcomes, by supporting projects with a specific purpose and defined endpointFootnote iii.

The Project Grant Program is expected to:Footnote iv

Program Expenditures, Application and Success Rates

For the period from 2011-12 to 2017-18, grant expenditures for the OSP comprised OOGP grants until the introduction of FGP grants in 2015-16 and PGP grants in 2016-17, although any OOGP grants still in progress continued to incur expenditures. Grant funds increased steadily from $434M in 2011-12 to $539M in 2017-18 (Table 1: Operating Support Program Expenditures (2011-12 to 2017-18) in Millions). It should be noted that over the period covered by the evaluation, the majority of grants were funded through the OOGP (82.5%), followed by the FGP (10.6%) and the PGP (6.9%). The OSP accounted for between 48% and 52% of CIHR’s total annual grants and awards expenditures between 2011-12 and 2017-18; the OOGP accounted for 43-54% between 2000-01 and 2010-11, as noted in the previous evaluation.

The application pressure for the OOGP was relatively consistent until 2012-13, increasing in its last fiscal year before the launch of the new programs. Success rates for the OOGP started decreasing in 2009-10 and continued to do so, likely due to the flat-lined budget between 2008 and 2013, as noted by the PREP. Applications to the PGP competitions far outnumber those to the FGP competitions, and the success rates for the former appear to be slightly higher than for the latter (Figure B: Application pressure and success rates across OOGP, Foundation and Project Grant programs, 2006-07 to 2017-18).

Description of Evaluation

The evaluation of CIHR’s OSP, for the period from 2011-12 to 2017-18, focused on the performance of the former OOGP as well as the relevance and design and delivery of the FGP and PGP, which replaced the OOGP. In 2011-2012, the Evaluation Unit at CIHR conducted an evaluation of the relevance and performance of the Open Operating Grant Program (OOGP) from 2000-01 to 2010-11.

Evaluation Scope and Objectives

CIHR’s evaluation of its OSP, included in CIHR’s 2017-18 Evaluation Plan, was designed to meet the Tri-Agencies’ requirements to the Treasury Board of Canada Secretariat (TBS) under the Policy on Results (2016) by addressing the core issues of performance, efficiency, relevance, and design and delivery. In addition, the evaluation was intended to provide senior management with independent, objective, and actionable evidence about the impacts of research funded through the OOGP and the effectiveness of the design and delivery of the FGP and PGP. Findings from the evaluation have informed discussions and decisions about the OSP, particularly the FGP, throughout the evaluation.

The evaluation covers the OOGP, FGP, and PGP as part of the OSP from 2011-12 to 2017-18. Similar to the approach taken in the 2012 evaluation of the OOGP, the scope excludes funding through priority-driven mechanisms. The evaluation meets the requirements of the Treasury Board of Canada Secretariat (TBS) under the Policy on Results and the Financial Administration Act.

It is important to note that the evaluation was completed in fiscal year 2019-20, with the approval and web posting of this report, as well as the development of the management action plan, delayed due to the COVID-19 pandemic. It should also be acknowledged that a number of important changes have taken place at CIHR since the completion of this report, most notably the implementation of CIHR’s 2021-2031 Strategic Plan, which has resulted in a number of key actions related to advancing research excellence, building health research capacity, and integrating evidence in health decisions.

Previous Evaluation of the OOGP

The OOGP was evaluated previously in 2012. Broadly, the evaluation found that the program met key objectives, contributed to the creation and dissemination of health-related knowledge, and supported high quality research. Considering the forthcoming program and peer review changes with the reforms, recommendations included:

  1. ensuring future open program designs utilized peer reviewer and applicant time efficiently;
  2. that designs account for varying application, peer review and renewal behaviours across Pillars;
  3. that further analyses are conducted on changes to the peer review system to fully understand potential impacts; and
  4. creating measures of success for future open programs, ensuring they are relevant for CIHR’s different health research communities.

The Management Response Action Plan (MRAP), included in the report, showed agreement with all recommendations, most of which were addressed throughout the implementation of the reforms.

Factors Affecting the Current Evaluation

Several factors had an impact on the current evaluation. Most importantly, there was a major program shift from the OOGP (which was CIHR’s and previously the Medical Research Council’s primary mechanism for supporting investigator-initiated research for decades) to the FGP and PGP. In 2009, CIHR launched a five-year strategic plan, Health Research Roadmap, introducing a vision to reform the peer review and open funding programs. Beginning in 2010, CIHR started the process of reforming its IIR programs, including the OOGP, and the related peer review processes, to address the full scope of its mandate while also reducing the burden on peer reviewers and applicants. CIHR intended the implementation of the changes to be iterative, drawing on consultations with stakeholders, pilot studies, and its own internal reviews. The new suite of programs, which replaced the OOGP, were piloted and implemented over the 2010-2017 periodFootnote v. However, the reforms, which took place in the context of constrained funding and experienced implementation challenges, were met with mixed reactions from the health research community. CIHR’s strategic plan was also updated during the implementation of the reforms (Health Research Roadmap II – 2014-15 to 2018-19). Despite the evolving nature of these programs, the evaluation proceeded in order to meet TBS requirements under the Policy on Results. In 2016, CIHR announced the PREP, which was supported directly by the OSP evaluation team. Then in 2019, CIHR began the process to develop its next strategic plan and made the decision to sunset the FGP. Additional details are provided below.

Evaluation Questions

The following evaluation questions were developed to support the evaluation objectives and were informed by consultations with the Executive Management Committee (in its capacity as CIHR’s Performance Measurement and Evaluation Committee), various Directors General from CIHR Branches (Finance; Research, Knowledge Translation and Ethics; Performance and Accountability), and the Subcommittee on Implementation and Oversight (SCIO).

Relevance: Operating Support Program

  1. How does the Operating Support Program align with CIHR and the Government of Canada’s roles and responsibilities in investigator-initiated health research?

Performance: Impact Assessment of OOGPFootnote vi

  1. What are the outcomes and impacts of CIHR investments in the Operating Support Program through the OOGP related to:
    1. Advancing health-related knowledge through the production and use of research
    2. Building Canadian health research capacity
    3. Informing decision making through the dissemination of health-related knowledge generated from OOGP supported research
    4. Health impacts generated from OOGP supported research
    5. Broad socio-economic impacts generated from OOGP supported research

Design and Delivery: Implementation of Foundation and Project Grant Programs, Costing Associated with the Operating Support Program

  1. Have the programs been designed and delivered to achieve expected outcomes?
  2. Is CIHR’s Operating Support Program being delivered in a cost-efficient manner?

Methodology

Consistent with TBS guidelines and recognized best practices in evaluation, a range of methods and data sources were used to triangulate the evaluation findings (document and data review; end of grant reports, n = 3,304; case studies, n = 8; bibliometric analyses).

The design for the current evaluation was developed in response to the information needs of senior management at a particular point in time, given the stage of implementation of the new suite of programs. It was conducted at a time when additional monitoring and consultative activities were taking place, including the collection of data to inform program implementation. Given the amount of data being collected for the implementation of the new suite of programs, the extensive consultations conducted during the reforms, and decisions by senior management at CIHR, the design specifically incorporated the majority of these activities as lines of evidence (e.g., PREP), while purposefully limiting the use of additional primary data collection measures with the research community (e.g., interviews, surveys).

Given the early stages and continued modification of the FGP and PGP, the focus for the assessment of performance was on the OOGP. Specifically, the evaluation examined the outcomes and impacts of the OOGP related to advancing knowledge, building Canadian health research capacity, informing decision-making, health impacts, and broad socio-economic impacts. The creation, dissemination, knowledge translation, and use of health-related knowledge were specific objectives of the OOGP. Similarly, the creation and use of health-related knowledge is an objective for both the FGP and PGP. Therefore, lessons learned from the performance assessment of the OOGP were expected to be applicable to the successor programs. Furthermore, the Canadian Academy of Health Sciences (CAHS) Impact Framework (CAHS, 2009) was used to guide the analysis of outcomes and impacts from the OOGP. Data from end of grant reports were disaggregated by pillar, sex, career stage, and language, with comparative analyses undertaken when sample sizes were large enough. Similarly, when possible and appropriate, comparisons were made to findings from the Evaluation of OOGP (2012). Additional methodological details can be found in Appendix B - Methodology.
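
As a simple illustration of this disaggregation step, a grouped summary over the end of grant report data might look like the sketch below; the column names are assumptions for the example, not CIHR's actual schema:

```python
import pandas as pd

# Illustrative end of grant report records; column names are assumed.
reports = pd.DataFrame({
    "pillar": ["P1", "P1", "P2", "P3", "P4", "P2", "P1", "P4"],
    "sex": ["M", "F", "F", "M", "F", "M", "M", "F"],
    "career_stage": ["senior", "early", "mid", "senior",
                     "mid", "early", "mid", "senior"],
    "articles": [14, 8, 9, 5, 7, 6, 11, 10],
})

# Disaggregate outputs by pillar and sex; in the evaluation, comparative
# analyses were undertaken only where subgroup sample sizes permitted.
summary = reports.groupby(["pillar", "sex"])["articles"].agg(["count", "mean"])
print(summary)
```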

Limitations

The following limitations should be noted:

Evaluation Findings

Relevance: Continued Need for OSP and Alignment with CIHR Act

Key Findings

The OSP was closely aligned with CIHR’s role, responsibilities, and priorities

The program contributes to:

CIHR's mandate is "to excel, according to internationally accepted standards of scientific excellence, in the creation of new knowledge and its translation into improved health for Canadians, more effective health services and products and a strengthened Canadian health care system". CIHR’s vision is to position Canada as a world leader in the creation and use of health knowledge that benefits Canadians and the global community.

The OSP contributes to the achievement of this overarching mandate and vision by attracting and funding health research excellence (through the OOGP, FGP, and PGP since 2000), with funded researchers outperforming benchmark comparators (e.g., health researchers in OECD countries). In addition, the OOGP has facilitated the creation, dissemination and use of health-related knowledge (more so within academia), as well as the development and maintenance of Canadian health research capacity, by supporting original, high-quality projects proposed and conducted by individual researchers or groups of researchers in all areas of health research, as evidenced in the current evaluation. Relevance was also strongly affirmed in CIHR’s 2012 Evaluation of the Open Operating Grant Program.

Given that many of the OOGP objectives are encompassed within the objectives of the FGP and PGP, similar results would be expected from these two programs as were observed for the OOGP. However, there are some differences in the structure of the two new programs. Broadly, the objectives of the FGP and PGP together are generally aligned with the CIHR Act and mandate; however, the FGP did not update its objectives to reflect the ineligibility of early career researchers, and the PGP does not explicitly mention the development and maintenance of Canadian health research capacity in its objectives. Although it is expected that highly qualified personnel (HQP) would be involved in and trained through PGP grants, given that health research training is a priority for CIHR, this role should be explicitly recognized in the program’s objectives. In addition, it should be noted that the focus of the current evaluation was on the design and delivery of the new programs, and therefore no assessment of progress towards objectives has been undertaken. This should be a focus of future evaluations of IIR in general and the PGP in particular.

The OSP addressed a continued need and was aligned with Government of Canada priorities

The continued need for investigator-initiated research, and its alignment with Government of Canada priorities, is reinforced by:

The value of IIR is affirmed in Canada's Science Vision. Specific objectives include making science more collaborative through increased support for research through the granting councils and other support for research and research infrastructure, fostering the next generation of scientists, and promoting equity and diversity in research.

In addition, the Fundamental Science Review (led by Dr. Naylor) recommended that the Government of Canada should rapidly increase its investment in independent investigator-led research to redress the imbalance caused by differential investments favouring priority-driven research over the past decade.

The objectives and recommendations described above were reflected in the 2018 federal budget, which affirmed the government's commitment to supporting research and the next generation of scientists with a historic investment of nearly $4 billion over five years, with nearly $1.7 billion going to the granting councils to increase support and training opportunities for researchers, students, and other HQP. Budget 2019 built on these investments in research excellence in Canada through additional investments in science, research, and technology organizations and the establishment of a new Strategic Science Fund.

The OSP aligns with these priorities given that it provides grant funding to researchers to conduct research in any area related to health aimed at the discovery and application of knowledge. More specifically, its programs (OOGP, FGP, PGP) aim to successfully facilitate the creation, dissemination and use of health-related knowledge, as well as the development and maintenance of Canadian health research capacity by supporting original, high quality, and innovative projects proposed and conducted by individual researchers or groups of researchers in all areas of health research. The PGP also aims to promote collaboration across disciplines, professions and sectors.

Performance: Impact Assessment of the OOGP

Key Findings

OOGP funded research has resulted in advances in knowledge

The outcomes and impacts of investments in the OOGP in terms of advancing health-related knowledge through the production and use of research were examined through an analysis of end of grant reports, case studies, and bibliometric analyses. Recall that the CAHS Impact Framework was used to guide the analysis of outcomes and impacts from the OOGP. CAHS defines advancing knowledge as new discoveries and breakthroughs from health research, and contributions to the scientific literature. It includes measures of research quality, research activity (volume), outreach to other researchers, and structural measures (the research fields in which an organization is active and how it balances its portfolio of different research fields).

The best researchers were selected for OOGP funding

Bibliometric analysis is one frequently used approach to measuring knowledge creation, and is seen as an objective, reliable and cost-effective way to measure peer-reviewed research outputs (Campbell et al., 2010). Academic papers published in widely circulated journals facilitate access to the latest scientific discoveries and advances and are seen as some of the most tangible outcomes of academic research (Godin, 2012; Larivière et al., 2006; Moed, 2012; Thonon et al., 2015). More specifically, bibliometric analysis employs quantitative analysis to measure patterns of scientific publication and citation, typically focusing on journal papers, to assess the impact of research (Ismail, Farrands, & Wooding, 2009). In this evaluation, the Average of Relative Citations (ARC) and the Average Relative Impact Factor (ARIF) are used as measures of scientific impactFootnote vii.
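
By way of illustration, the sketch below shows how relative-citation metrics of this general kind are typically computed in bibliometric practice; the column names and the normalisation by subfield and publication year are assumptions for the example rather than CIHR's exact method, and in a real analysis the denominators would come from the full citation database, not a single sample of papers:

```python
import pandas as pd

# Illustrative paper-level records; column names are assumed, not CIHR's schema.
papers = pd.DataFrame({
    "citations":  [12, 3, 7, 45, 0, 20],
    "journal_if": [4.1, 2.3, 3.3, 9.8, 1.2, 5.0],
    "subfield":   ["oncology"] * 3 + ["immunology"] * 3,
    "pub_year":   [2010, 2010, 2010, 2011, 2011, 2011],
})

# Relative citations (RC): a paper's citations divided by the mean citations
# of papers in the same subfield and publication year.
papers["rc"] = papers["citations"] / papers.groupby(
    ["subfield", "pub_year"])["citations"].transform("mean")

# Relative impact factor (RIF): journal impact factor divided by the mean
# impact factor of journals in the same subfield and year.
papers["rif"] = papers["journal_if"] / papers.groupby(
    ["subfield", "pub_year"])["journal_if"].transform("mean")

# ARC and ARIF are the averages of these ratios over a researcher's (or a
# program's) papers; values above 1.0 indicate above-average impact.
print(f"ARC = {papers['rc'].mean():.2f}, ARIF = {papers['rif'].mean():.2f}")
```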

The limitations of bibliometric analysis include difficulties estimating publication quality based on citations, differences in citation practices across disciplines and sometimes between sub-fields in the same discipline, as well as the difficulty of moving beyond contribution to attribution (Ismail, Nason, Marjanovic, & Grant, 2012). CIHR has also recently signed the San Francisco Declaration on Research Assessment (DORA), which recognizes the need to improve the ways in which the outputs of scholarly research are assessed. Consistent with best practice, the triangulation of bibliometric analysis with other lines of evidence is also used in this evaluation to assess knowledge creation as a result of the program. Analyses of end of grant report information as well as case studies were used to assess highly impactful research resulting from OOGP funding. It should also be noted that the bibliometric analyses in this report are based on data for publications produced by OOGP researchers during the time they were supported by these grants. While this method is commonly accepted, based on the assumption that these grants are a significant contribution to research output (e.g., Campbell et al., 2010; Ebadi & Schiffauerova, 2015), a direct attribution between grant and publication bibliometric data cannot be made.

An analysis of the journal publications and associated citations (up to and including 2016) of a sample of funded (n = 2,500) and unfunded (n = 500) applicants from OOGP competitions from 2000 to 2014 was undertaken. Overall, results demonstrated that the selection process used in OOGP competitions allowed for the selection of the best applicants. Funded applicants produced, on average, 0.6 more papers, with greater impact. Specifically, both the ARC and ARIF scores for funded applicants were higher than for unfunded applicants during the two years prior to the competitions (1.60 vs. 1.34 and 1.39 vs. 1.21, respectively). However, it should be noted that these differences, while statistically significant, are small.
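
The caveat that these differences are statistically significant but small can be made concrete with an effect-size calculation. The sketch below uses simulated numbers (not the evaluation's data) to show how, with samples of this size, a modest standardized difference in mean ARC can still yield a very small p-value:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated ARC scores for funded (n = 2,500) and unfunded (n = 500)
# applicants; the lognormal shapes are an assumption for illustration only.
funded = rng.lognormal(mean=0.35, sigma=0.55, size=2500)
unfunded = rng.lognormal(mean=0.20, sigma=0.55, size=500)

# Welch's t-test for the difference in mean ARC.
t_stat, p_value = stats.ttest_ind(funded, unfunded, equal_var=False)

# Cohen's d: standardized mean difference using a pooled standard deviation.
pooled_sd = np.sqrt((funded.var(ddof=1) + unfunded.var(ddof=1)) / 2)
cohens_d = (funded.mean() - unfunded.mean()) / pooled_sd

# Large n drives p below conventional thresholds even when d is small, which
# is why the report distinguishes statistical from practical significance.
print(f"p = {p_value:.2e}, Cohen's d = {cohens_d:.2f}")
```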

Researchers were more productive when supported by OOGP funding

For the studied period (2001-2015), funded applicants published, on average, slightly more than four papers annually, compared to 2.6 when they were not funded. The ARIF of supported papersFootnote viii improved with time, and the ARIF value of supported papers for the 2000-2016 period (1.39) is above that of unsupported papers (1.30). However, despite a statistically significant difference (p < 0.001), the ARC score of supported publications (1.57) is not practically greater than that of unsupported ones (1.55). The ARC in the current evaluation is slightly higher than the ARC observed in the previous evaluation, which increased significantly between 2001-2005 and 2006-2009 (1.44 vs. 1.54, p < 0.001).

In addition, after the period of OOGP funding, funded applicants were also more productive (an average of 0.6 more papers than unfunded applicants) and had greater impact, publishing papers that garner more citations and publishing in journals that are cited more often than those of unfunded applicants. Funded applicants had greater ARC and ARIF scores than those who were unfunded (1.61 vs. 1.35 and 1.40 vs. 1.21, respectively); although statistically significant, these differences are relatively small. There is a global improvement in the scientific performance of funded researchers over time. However, the gap between funded and unfunded applicants widens over the period, since papers stemming from funded applicants increased their scientific impact (ARIF and ARC) more markedly than papers stemming from unfunded applicants.

Overall, OOGP funding is positively correlated with scientific productivity and impact. Recall that the data show that supported researchers were more productive and had better impact scores than applicants who were unfunded in OOGP competitions. This study also demonstrated that the duration of OOGP funding is positively correlated with productivity. As one might expect, on average, senior researchers (more than 10 years of experience) produce more papers than early career (5 years or less of experience) and mid-career researchers (between 6 and 10 years of experience). However, in terms of impact, there is no difference in ARIF scores across career stages, while early career researchers have higher ARC scores compared to mid- and senior-career researchers. This finding is likely due in part to the fact that the ARC in the current study includes self-citations, and it supports the assertion that early career researchers tend to cite newer and younger literature (Gingras, Larivière, Macaluso, & Robitaille, 2008).

Subgroup analysis by sex shows that male researchers were slightly more productive than female researchers (average of 0.7 more papers annually) and had greater impact scores (ARC = 1.62 vs. 1.48; ARIF = 1.41 vs. 1.34, respectively). This finding is consistent with the literature showing that men tend to have greater bibliometric research productivity and impact outcomes compared to women (Larivière, Ni, Gingras, Cronin, & Sugimoto, 2013).

Subgroup analysis by preferred language shows that among researchers whose preferred language is French, only 2.16% of their publications are in French. It should be noted that almost all papers indexed in the Clarivate Analytics Web of Science database used for the bibliometric analyses are in English and therefore language results should be interpreted with caution.

OOGP-funded researchers had greater scientific impact than other health researchers

OOGP-funded researchers not only show better impact in the health sciences than the Canadian average (ARC = 1.61 vs. 1.34; ARIF = 1.39 vs. 1.18, respectively), but also than the best performing OECD countries in this domain (ARC ranges from 1.20 to 1.57; ARIF ranges from 1.07 to 1.28). This finding is consistent with that found in the previous evaluation of the OOGP (CIHR, 2012).

OOGP research outputs increased since the previous evaluation

In addition to the results of the bibliometric analyses relating to research quality, end of grant reports include measures related to research activity, such as the number of knowledge products produced as a result of an OOGP grant (i.e., journal publications, conference presentations, books/book chapters, and technical reports). End of grant data, provided via self-report from grant NPIs, are currently collected by CIHR for all operating grants, typically within 18 months of grant completion (see Appendix B - Methodology for additional details on end of grant reports and methods of analysis). These data should be interpreted with some caution given that they are self-reported, represent 29% of grants awarded during the period, and that simply producing these knowledge products gives no indication of quality, use, or translation. However, when considered alongside the bibliometric analyses, this measure provides useful data on the outputs that result from this investigator-initiated program, as well as some insight into the publishing behaviours of the different parts of CIHR's health research community funded through the OOGP.

The most reported knowledge products produced from grants are published journal articles and conference presentations. In general, and as expected, the production of other types of knowledge products was quite low (e.g., 40% of grants produced books/book chapters, 10% produced technical reports). The range in number of each type of scientific output varied considerably across all types (Table 2: Knowledge Products, Grant Duration and Amount by Pillar).

Almost all (95%) OOGP-funded researchers publish journal articles, with an average of 10.62 per grant. This is an increase from the average of 7.6 papers per grant reported in the previous evaluation (CIHR, 2012). It is not entirely clear why the number of articles has increased; however, it may be attributable to an overall increase in journal productivity observed globally (Bornmann & Mutz, 2015; Monroy & Diaz, 2018). Additionally, the majority of OOGP grants resulted in invited conference presentations (88%, an average of 13 per grant), and over half resulted in other presentations (59%).

Pillar 1 grants and grants with male NPIs produced more journal articles and conference presentations

Additional analyses show that there are significant differences in journal article production and conference presentations (invited and all other) across pillars (p < .001). Pillar 1 grants produced more journal articles (M = 11.52) compared to all other pillars (Pillar 2: M = 8.23, Pillar 3: M = 5.63, Pillar 4: M = 8.81, with no significant differences among Pillars 2-4; see Figure C: Average Number of Publications and Presentations by Pillar). It should be noted that the average number of publications increased for all pillars since the previous evaluation, by approximately two to three additional papers. Pillar 1 grants were also associated with a higher number of invited presentations than Pillars 3 and 4 (with no difference from Pillar 2). In terms of all other conference presentations, Pillar 3 was significantly lower than all other pillars.

Additional analyses show that there are significant differences in journal article production and the number of invited conference presentations across sex as well (p < 0.001), while there was no difference for all other conference presentations (p < 0.04). Grants with male NPIs produced a higher number of published journal articles and conference presentations than grants with female NPIs (see Figure D: Average Number of Publications and Presentations by Sex).

Unsurprisingly, there is an observable increase in the average number of published journal articles and invited presentations as the career stage of NPIs on grants increases from early (M = 9, SD = 9), to mid (M = 10, SD = 11), to senior (M = 12, SD = 13). This finding, for publications, is consistent with the bibliometric results above. Overall, there were no observable differences in knowledge products based on the preferred language of the NPI.

Grant duration and amount were significant predictors of journal article productivity, followed by sex

Consistent with the previous evaluation, journal article production is moderately correlated with the amount (value) and duration of the grants awarded (r = 0.36, n = 3,134, p = 0.01 for both independent variables; see Table 2: Knowledge Products, Grant Duration and Amount by Pillar). Additionally, the amount and duration of grants are strongly correlated with each other: longer grants tend to have more money (r = 0.67, n = 3,304, p < 0.01). Both the duration and the amount of a grant therefore appear to have an important relationship with the number of publications produced.

Grant duration and amount differ across the four pillars: Biomedical researchers have the longest grant durations on average (4.3 years) compared with the other three pillars (3.4, 2.7, and 3.1 years for Pillars 2 through 4, respectively; see Table 2: Knowledge Products, Grant Duration and Amount by Pillar for additional data)Footnote ix. These differences are statistically significant (one-way ANOVA, p < 0.001), similar to the previous evaluation. Grant duration and amount also differ by sex: on average, male NPIs hold significantly longer grants (M = 4.1 years, SD = 13.22, n = 2,375, p < 0.001) compared to female NPIs (M = 3.7 years, SD = 13.42, n = 925). Similarly, male NPIs receive significantly larger amounts of funding than female NPIs (M = $527,436, SD = 268,967, n = 2,375 vs. M = $452,985, SD = 291,672, n = 925, p < 0.001), representing an average gap of $74,451 (see Table 3: Knowledge Products, Grant Duration and Amount by Sex). Given these differences, further analyses were conducted to investigate whether the difference in published journal articles by pillar or sex could be attributable to an overlap with duration and amount, as opposed to other distinct differences among pillars or sexes.
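
As an illustration of the kind of test reported here, a one-way ANOVA on grant duration across pillars might look as follows; the group sizes and the assumption of normally distributed durations are invented for the example, with group means set near the averages reported above:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated grant durations (years) by pillar, centred on the reported
# averages (4.3, 3.4, 2.7, and 3.1 years); sample sizes are illustrative.
durations = [
    rng.normal(4.3, 1.0, 400),  # Pillar 1 (Biomedical)
    rng.normal(3.4, 1.0, 120),  # Pillar 2 (Clinical)
    rng.normal(2.7, 1.0, 60),   # Pillar 3 (Health systems/services)
    rng.normal(3.1, 1.0, 90),   # Pillar 4 (Population health)
]

# One-way ANOVA: does mean grant duration differ across the four pillars?
f_stat, p_value = stats.f_oneway(*durations)
print(f"F = {f_stat:.1f}, p = {p_value:.3g}")
```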

A moderated regression model was used to determine whether the predictive relationship of grant duration and amount on the number of published journal articles was influenced by pillar. The model confirmed that grant duration was a strong predictor of the production of journal articles (p < 0.001), as was amount (p < 0.001). Beyond this, however, pillar was neither a significant predictor nor a moderator of articles published. Although duration and amount are related (r = 0.66; longer grants tend to have more money), differences in productivity are largely due to the duration of the grant, followed by amount (which accounts for some variance independent of duration), while pillar has a negligible effect. This suggests that the differences in productivity observed among pillars are largely a function of duration and/or amount rather than of pillar itself.

Similarly, a moderated regression model showed that, in addition to grant duration and amount, sex was a significant predictor of the number of published journal articles (p = 0.04 for sex). In other words, male NPIs with longer, larger grants are likely to produce a greater number of journal articles. As in the pillar model above, however, sex was not as strong a predictor of productivity as grant duration or amount.
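
A moderated regression of this kind is commonly fit as an ordinary least squares model with interaction terms. The sketch below, reusing the hypothetical columns from the earlier examples, tests pillar and sex as moderators of the duration and amount effects; the evaluation's actual model specification is not documented here, so this is illustrative only.

```python
import pandas as pd
import statsmodels.formula.api as smf

grants = pd.read_csv("oogp_end_of_grant_reports.csv")  # hypothetical file

# Main effects plus pillar-by-predictor interactions; C() treats pillar
# as a categorical moderator. A non-significant interaction suggests the
# duration/amount effects do not differ by pillar.
pillar_model = smf.ols(
    "journal_articles ~ grant_duration_years * C(pillar)"
    " + grant_amount * C(pillar)",
    data=grants,
).fit()
print(pillar_model.summary())

# Analogous model with sex of the NPI as the candidate moderator.
sex_model = smf.ols(
    "journal_articles ~ grant_duration_years * C(npi_sex)"
    " + grant_amount * C(npi_sex)",
    data=grants,
).fit()
print(sex_model.summary())
```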

The significance of this analysis from an evaluation perspective is that it shows how assessing productivity by simply counting publications can be misleading. Given these findings, as well as the requirements of the current Policy on Results, future evaluations and performance measurement of the OSP should account for variables that influence productivity. These findings should also be considered in the future design and implementation of investigator-initiated programs. Areas for future research include identifying optimal grant durations and amounts (i.e., the point of diminishing returns in productivity relative to grant length) for investigator-initiated funding, where duration and amount are largely proposed by the researchers themselves.

Almost half of grants resulted in outcomes related to advancing knowledge beyond productivity

NPIs were asked about the production of a variety of other outcomes on their end of grant reports, some of which correspond with the CAHS category of advancing knowledge: research method, theory, replication of research findings, and/or tool, technique, instrument or procedure. Specifically, they were asked whether the grant had resulted in an advanced or newly developed outcome, or whether it would result in a future outcome. Just under 50% of the grants (of 3,304) resulted in these additional outcomes related to advancing knowledge. Almost half of the grants resulted in an advanced research method (44%), theory (49%) and/or replication of research findings (41%), and almost one third (30%) resulted in an advanced tool, technique, instrument or procedure.

NPIs indicated that advanced outcomes related to advancing knowledge were more likely to result from the grants right away (or within 18 months of grant completion, when end of grant reports were completed) than newly developed or potential future outcomes. Accordingly, fewer newly developed outcomes had resulted from OOGP grants (9-30%, depending on the specific outcome type), and an even smaller proportion of NPIs indicated that these outcomes might occur in the future as a result of the grant (7-12%). It is not entirely clear why responses ranged so widely; however, several outcomes related to advancing knowledge are measured in the end of grant report, some of which may simply not be applicable to all grants (e.g., theory). Additionally, the end of grant reports include multiple measures of the same constructs (i.e., advanced, newly developed, may occur in the future), which may also contribute to variation in responses.

Biomedical (Pillar 1) grants, as well as those with male NPIs, accounted for more of the advancing knowledge outcomes than grants from the other three pillars or grants with female NPIs. Given the low proportions, further analyses by pillar and sex are not reported.

Over half of grants involved other researchers in end of grant knowledge translation activities

Another area identified in the CAHS framework relating to advancing knowledge is outreach to other researchers. End of grant reports asked NPIs to report on their engagement with, impact on, and involvement of a variety of stakeholders including other researchers and academics (i.e., those not formally listed on their grant applications). In the context of this evaluation, this is considered to be an indicator of outreach to other researchers.

The majority of NPIs (87%) reported that other researchers and academics were aware of the findings resulting from their grants. There were observable differences by pillar, with considerably fewer Pillar 3 and 4 grants indicating other researchers and academics were aware of their results (62% and 37%, respectively). This suggests that NPIs on these grants tend to reach out to other researchers less frequently than NPIs on Pillar 1 and 2 grants.

NPIs were also asked the extent to which their grant had an impact on various stakeholder groups including other researchers/academics (those not included in the formal grant application). On average, NPIs felt their grant influenced other researchers to some extent (M = 3.37 out of 4, SD = 0.75, n = 3,303). There were no observable differences by pillar or sex.

Lastly, NPIs were asked to indicate the phases of the grant-related research process in which other researchers/academics were involved, both overall and at each stage. Over half (57%, n = 1,891) of the total 3,304 grants were identified as having other researchers/academics involved in the research process overall. Among grants with some level of involvement, the stage with the highest involvement of other researchers/academics was End of Grant Knowledge Translation Activities (57%, n = 1,073), followed by Development of the Protocol (48%, n = 604) and Data Collection/Project Implementation (42%, n = 793). Interpretation of Results and Development of the Research Question were the stages with the lowest involvement, at 36% (n = 671) and 35% (n = 669), respectively.

In addition to the analysis of end of grant report data, eight case studies of high impact grants were conducted (see Appendix B - Methodology for additional information on case selection and methods), providing additional insights into the outcomes and impacts of OOGP grants. Each case greatly advanced knowledge in its respective research area(s), as would be expected given that high impact cases were selected. The researchers had successfully carried out their planned research programs, albeit with some adjustments arising from results along the way or from changes in collaborations. In general, the advances had proceeded in incremental steps through the execution of long-term research programs involving clusters of interrelated studies and analyses.

Case study researchers and trainees were highly productive and shared knowledge advances extensively. Researchers chose important health research problems with great potential for health impact, positioning them to advance knowledge in highly competitive domains. The research programs started from a long-term, high-level vision for the direction the research could take over a period of decades, and the NPI and team persisted in executing that vision. The research programs used longitudinal research designs and/or developed large, high-quality databases. Knowledge advances were shared via national and international collaborations, presentations, and publications, among other media.

Collaborations among investigators contributed to the intellectual development of the main research ideas. In the Pillar 1 (Biomedical) and 2 (Clinical) cases, collaborations between basic and clinical scientists were particularly impactful in advancing the research. The NPIs adopted and maintained high standards of methodological excellence in research, training, and choice of collaborators. Researchers stayed on top of developments at the forefront of the field, showing flexibility and adaptability to integrate emerging advances from elsewhere. Long-term, stable funding from CIHR and many other sources aligned with the long-term vision and enabled its systematic execution.

OOGP Research Contributed to Building Health Research Capacity

Capacity development is a key objective in the OOGP as well as in the new suite of programs. It is also a key priority for CIHR as evidenced in its Act, strategic plan (Roadmap II, Strategic Direction I) and training strategy. CIHR supports capacity development directly through grants and awards for individual researchers and trainees, and indirectly, through providing funding for research projects that develop capacity through the involvement of students, trainees and other researchers/stakeholders.

The definition of capacity development used in the previous evaluation included the direct involvement in the research process of any paid or unpaid staff or trainee, including: researchers; research assistants; research technicians; postdoctoral fellows; post-health professional degree students (e.g., MD, BScN, DDS); fellows not pursuing a Master's or PhD; and doctoral, Master's, and undergraduate student trainees. The same approach is adopted in the current evaluation, along with the application of the CAHS Impact Framework. CAHS defines capacity building as the development and enhancement of research skills in individuals and teams, measured across three subcategories: personnel, additional activity funding, and infrastructure required for research.

Pillar 1 researchers received the majority of OOGP grants

Pillar 1 researchers received the majority of OOGP grants (around 72% of 13,331 from 2000-01 to 2015-16), a proportion that has been consistent since 2002-03, although down from 80% in the previous evaluation (CIHR, 2012). Pillar 2 researchers received 12% of the OOGP grants, Pillar 4 received 8.5%, and Pillar 3 received 6%. While grants were funded across all four pillars, some barriers may still be limiting greater representation from Pillar 3 and 4 researchers. A range of barriers and challenges facing researchers from these pillars applying to the OOGP was identified in the previous evaluation, relating to peer review, renewal applications and success, and cross-disciplinary projects (Thorngate, 2002; Tamblyn, 2011; Tamblyn et al., 2016).

OOGP grants are contributing to capacity building

Almost all OOGP grants involved research staff (95%; 16,347 research staff in total) and trainees (97%; 28,237 trainees in total), thereby contributing to capacity building in health research (Table 4: Research Staff and Trainees Involved in OOGP Grants). The average number of staff members and traineesFootnote x contributing to the research per grant was 13.5 overall, up from 8.61 in the previous evaluation (2012). This is expected given that student enrolment in Canadian universities has exceeded 2 million since 2011-12 and has been steadily increasing at rates of 1-3% per year (StatsCan, 2017); the increase in the average number of researchers and trainees involved in OOGP grants is likely due in part to this growth.

It is important to note that the numbers of researchers and trainees provided are estimates based on end of grant reports from a sample of 29% of OOGP grants funded between 2000 and 2013; the totals presented here therefore underestimate the population. Not all NPIs who received an OOGP grant completed an end of grant report (reports did not become mandatory until 2011, and there is still not 100% compliance). However, the current sample (n = 3,304 of 13,331) is representative of the population in terms of demographic characteristics. The reports in the sample were submitted between 2011-12 and 2016-17. Accordingly, while the totals provide a sense of the number of HQP involved and trained through OOGP grants, the averages may not be representative of the population of HQP.

More than half of grants involved research staff

More than half of the sampled grants (of 3,304) involved research staff (researchers: 56%; research assistants: 62%; research technicians: 63%), for a total of 16,347 (Table 4: Research Staff and Trainees Involved in OOGP Grants). On average, each grant involved more researchers (M = 4, both paid and unpaid) than research assistants (M = 3) or technicians (M = 2).

Two-thirds to three-quarters of grants involved trainees

In terms of trainees, more than three quarters of the grants trained Doctoral students (78%), while approximately two-thirds trained Master’s students (68%), Postdoctoral fellows (66%), and undergraduate students (64%). Few trained post-health professional degree students (17%) or fellows not pursuing a Master’s or Doctoral degree (8%). A total of 28,237 trainees were trained by the current sample of 3,188 OOGP grants. Again, this is an estimated total for trainees associated with 29% of OOGP grants funded from 2000 until 2013 (Table 4: Research Staff and Trainees Involved in OOGP Grants).

Of the grants that had trainees, each grant involved on average approximately five undergraduate students (both paid and unpaid), followed by approximately three Doctoral students; all other trainee types averaged approximately two per grant.

Pillar 1 grants involved more research staff and trainees

When looking across pillars, Biomedical (Pillar 1) grants involved and trained more research staff (e.g., researchers, research assistants, and technicians) and trainees than Pillar 2 through 4 grants (Figure E: Percentage of Grants Involving Research Staff and Trainees by Pillar). Pillar 1 grants accounted for 51% of the research staff in the sample and 79% of the trainees. This is not surprising given that a larger proportion of the sample was made up of Pillar 1 grants (73% of 3,304). The distribution of research staff and trainees matched that of the sample, with more trainees than research staff involved in these grants. In contrast, Pillar 2 through 4 grants involved more than twice as many research staff as trainees. Although the total number of trainees was lower for grants from Pillars 2 through 4, the average number of trainees involved per grant stayed relatively consistent across pillars.

However, it should be noted that the average number of research staff was lower for Pillar 1 grants (only one per grant) than for both the sample overall and all other pillars (Figure F: Average Number of Research Staff and Trainees per Grant by Pillar). In fact, the average number of research staff for Pillars 2 through 4 was much higher than for the sample overall.

Grants with male NPIs involved more research staff and trainees

When looking at HQP by the sex of the NPI, grants with male NPIs involved more HQP overall than grants with female NPIs (Figure G: Percentage of Grants Involving Research Staff and Trainees by Sex of NPI). Grants with male NPIs accounted for 66% of the research staff in the sample and 71% of the trainees. This is most likely because male NPIs made up 72% of the sample (of 3,304); however, it is not clear that this finding is due solely to sex. It may also be due to pillar, given that Pillar 1 grants also had a higher number of HQP, or to a combination of sex and pillar. The distribution of research staff and trainees matched that of the sample, with more trainees than research staff involved regardless of NPI sex. Although the total number of trainees is lower for grants with female NPIs, the average number of trainees involved in each grant stayed relatively consistent across sex.

However, it should be noted that the average number of research staff was slightly higher for grants with female NPIs than for both the sample overall and grants with male NPIs (Figure H: Average Number of Research Staff and Trainees per Grant by Sex of NPI). This is likely explained by the fact that female NPIs are more common on Pillar 3 and 4 grants.

Health research capacity was greatly strengthened in the case studies included in the evaluation. NPIs' research programs engaged a substantial number of trainees (between 4 and 15 trainees per initial OOGP grant), with subsequent increases in capacity beyond the initial grant. One case reported five PDFs, two PhD students and one Master's student at the end of the grant funding period in 2012; as of May 2018, it had involved a far greater number of trainees, including 15 PDFs, five PhD students, 12 Master's students, and 26 undergraduate students. NPIs paid careful attention to recruiting the best possible trainees, very often internationally. They aimed to produce researchers whose careers would go on to surpass their own in terms of research impact and prestige.

Funding for trainees came from a variety of sources, including Strategic Training Initiatives in Health Research (STIHR) grants, Canada Research Chairs, and co-funding arrangements with international governments and universities. The training environments provided were characterized by an emphasis on research excellence and methodological rigor, said to be ideal environments for incubating strong researchers. The research designs employed by many of the cases provided advantages for training, accelerating trainee development and early productivity. Trainees indicated that their experience in the initial OOGP grant and/or subsequent grants had contributed positively to their career development and success. Many trainees involved in the case study research programs have moved into successful academic and industry career paths. There was a pattern among exemplary cases of intergenerational research success: strong research capacity moulded and maintained over time in successive supervisor-trainee cohorts. Decision makers interviewed for one case study indicated that their participation had built their capacity to engage in research.

With respect to infrastructure, four of the eight case studies benefitted from significant infrastructure funding that supported their OOGP case grant research program. Some host institutions had accorded institutional priority and visibility to the research program in which the OOGP funding was embedded, which helped the NPIs acquire research facilities and equipment as part of institutional development. Key pieces of infrastructure developed in some case studies were databases that are recognized internationally as uniquely capable of contributing to knowledge advancement because of their breadth, scope, rigor, and/or long follow-up. In these case studies, having high quality infrastructure was seen as especially important in attracting trainees to fields carrying stigma attached to the diseases under study (related to mental illness or addiction). This infrastructure afforded trainees the opportunity to learn cutting-edge skills in a field they might not otherwise have chosen.

This outcome is interdependent with the CAHS’s “advancing knowledge” outcome because in attracting the best possible trainees to be trained through their funded research, these NPIs helped to ensure that trainees advanced knowledge through their own intellectual contributions and enhanced collective productivity.

Informing Decision-Making Beyond Other Researchers and Study Stakeholders is Limited

OOGP grants are not only expected to create knowledge but also to contribute to the dissemination, commercialization/knowledge translation, and use of health-related knowledge. Similarly, objectives of the FGP and PGP also include the use of health-related knowledge. It is also a key priority for CIHR as evidenced by the strategic plan (Roadmap II, Strategic Direction I) in place during the period under review. CIHR’s Act also specifies the following objectives related to informing decision-making: (h) promoting the dissemination of knowledge and the application of health research to improve the health of Canadians; and (i) encouraging innovation, facilitating the commercialization of health research in Canada and promoting economic development through health research in Canada.

CAHS refers to informing decision-making as the impacts of research on science, public, clinical, and managerial decision-making, practice, and policy. It is measured across four subcategories: health-related decision making, research-related decision making, health products/industry decision making, and general public decision making. Knowledge translation, assessed in the previous evaluation, focused on commercialization as well as stakeholder involvement and impact (beyond researchers formally listed on the grant application), examined through case studies and end of grant report analysis.

Less than half of OOGP grants involve and impact stakeholders beyond other researchers and study stakeholders

Study stakeholders formally listed on the grant application (40% of 3,304) and Health System Practitioners (34%) were the stakeholder groups most frequently involved in grants, although involvement across other stakeholder groups was low overall (3%-19%; consistent with the previous evaluation)Footnote xi. In the current evaluation, involvement included taking part in one or more of the following: development of research questions/protocol, data collection/project implementation, interpretation of results, and end of grant KT activities. Over one third of NPIs (35%) felt they had an impact on study stakeholders to a great extent, again consistent with the previous evaluation.

Focusing on grants where NPIs identified that stakeholders were involved in end of grant KT activities, those groups most frequently involved included the media (92%), consumer/charity groups (81%), and federal/provincial representatives (81%).

Pillar 2 through 4 grants are more likely to impact health practitioners

There were some observable differences across pillars; however, given the lower sample sizes, statistical comparisons were not undertaken. A greater proportion of Pillar 2, 3 and 4 grants involved stakeholders beyond those formally listed on the grant application compared to Pillar 1. More specifically, approximately three-quarters of Pillar 2 (71%) and Pillar 3 (79%) grants involved health system practitioners. Almost half of Pillar 2 grants also involved patients (46%), while more than half of Pillar 3 grants involved care managers (57%).

With respect to end of grant KT activities, Pillar 1 and Pillar 3 grants reported higher levels of stakeholder involvement overall compared to Pillar 2 and 4 grants, notably from media (89% and 97%, respectively), consumer groups/charitable organizations (82% for Pillar 1), community/municipal organizations (80% for Pillar 1), and federal/provincial representatives (74% for Pillar 3). Pillars 2 and 4 tended to involve health practitioners (Pillar 2: 71%; Pillar 4: 67%) and patients/consumers (Pillar 2: 46%; Pillar 4: 40%). There were no observable differences across sex, language or career stage.

OOGP funded research demonstrated limited translation of knowledge beyond academia

NPIs reported on whether the following outcomes (included in end of grant reports) related to informing decision-making had resulted from their grants: Findings Cited by Others; Policies, Guidelines, or Programs; Information or Guidance for Patients or Public; and Patents. NPIs indicated whether the outcome was advanced, newly developed, or whether it would result in the future.

Approximately half of the grants resulted in Findings Cited by Others (50% advanced, 17% newly developed, of 3,304), while very few grants resulted in Information or Guidance for Patients/Public (11%), Policies, Guidelines or Programs (8%), or Patents (7%). These findings are generally consistent with the previous evaluation, although Patents were lower in the current evaluation (7% vs. 11% previously). It should be noted that knowledge translation beyond academia can take some time after grant completion, and end of grant reports are completed within approximately 18 months of grant completion.

There were some observable differences by pillar. Specifically, Pillars 2 through 4 had more grants that resulted in Policies, Guidelines or Programs (17%, 27%, and 23%, respectively) and Information or Guidance for Patients or Public (26%, 23%, and 23%, respectively); while fewer grants across Pillars 2 through 4 resulted in Findings Cited by Others and Patents. There were no observable differences across sex, language or career stage, other than more female NPIs indicated Policies, Guidelines or Programs would result from their grants in the future (41% vs. 33% for male NPIs).

Additional analyses were undertaken for grants identifying an advanced outcome for Information or Guidance for Patients or Public; Policies, Guidelines, or Programs; and Patents. Specifically, the open-ended descriptions of outcomes provided were analysed (see Appendix B - Methodology for a description of the qualitative analysis approach). For Information or Guidance for Patients or Public, 60% (n = 80 of a random sample of 134 grants) specified that impacts had been achieved (n = 70) or could be achieved in the future (n = 10). The majority of grants that had achieved impact provided Information (91%, n = 64), with fewer providing Guidance (21%, n = 15), and some providing both (19%, n = 13). The most frequently reported mechanisms through which Information or Guidance had been communicated were media (newspaper, radio, and television; 24%, n = 17) and presentations (21%, n = 15).

Among the grants that provided responses related to impact via Policies, Guidelines, or Programs, just over half (55%, n = 66 of 120 randomly selected grants) specified an impact. Of these 66 grants, 78% (n = 52) indicated that impact had already occurred and 22% (n = 14) indicated that impact would occur in the future. Among the 52 grants specifying that impact in this area had already occurred, the most frequently reported pathway or mechanism was Guidelines (30%, n = 17), with Policies (13%, n = 7) and Programs (11%, n = 6) reported less frequently. Almost all of the grants specifying an existing impact (98%, n = 51) fell into the CAHS impact framework subcategory of health-related impact, and only 2% (n = 1) fell into the subcategory of research impact. Of the 51 grants specifying impact in health, the majority (70%, n = 36) were related to health care, and a further fifth (20%, n = 10) were related to public health.

From the grants that provided responses describing advanced outcomes related to Patents, a random sample of 51 grants was analyzed. Of these, 41 had at least one filed or obtained patent: almost half had filed a Patent (46%, n = 19) or obtained a Patent (44%, n = 18), and less than 10% (n = 4) had both filed and obtained a Patent. Of these 41 grants, a majority (59%, n = 24) were in multiple health areas. The most common health area for Patents was neuroscience and mental health (24%, n = 10), followed by genetics (17%, n = 7). A small proportion of grants explicitly stated a link with industry (12%, n = 5), or held U.S. (17%, n = 7) or other international Patents (7%, n = 3).

Given the findings above about the infrequent involvement of stakeholders beyond those formally listed on the grant, it is not surprising that outcomes related to informing decision making and use of research results are limited. Knowledge translation and use are generally enhanced when end-users are involved in all aspects of the research process (Graham & Tetroe, 2007; Adily et al., 2009; Lomas, 2000); therefore, promoting or incentivizing the involvement of knowledge users would be beneficial in achieving longer-term impacts.

In the period studied, all case studies influenced research decision making through their published findings. Impacts on decision-making outside research were, to date, more limited, with a subset of cases having achieved or being close to achieving such impact. Among those that informed decision making beyond research, NPIs were able to position their research to inform, or imminently inform, decisions related to health system policy or program change(s), regulatory intervention, clinical practice guidelines, or patient behaviour. Other cases had the potential to inform decision making in the future, although there was no clear pathway or timeline for how or when this would occur. Although the NPIs and other beneficiaries were aware of the potential, in four of the eight cases they regarded disseminating research findings to influence decision-making outside of research as beyond their scope and/or capacity.

None of the case study research programs have secured significant health products industry investment toward the commercialization of therapeutics or prevention products to date, although one is poised to do so. Two others, in Pillars 1 (Biomedical) and 2 (Clinical), were commercializing ancillary technologies.

Limited Health Impacts Resulted from OOGP Funded Research

CAHS refers to health impacts as health and health systems improvements, encompassing research-related advances in prevention, diagnosis, treatment, and palliation. Impacts are measured across three subcategories: health status, determinants of health, and health care system performance. A key objective of the OOGP and the new suite of programs is the use of health-related knowledge. It is also a key priority for CIHR as evidenced in both the strategic plan (Roadmap II, Strategic Direction I) and training strategy in place during the period under evaluation. CIHR's Act also specifies the following objectives related to health impacts: (h) promoting the dissemination of knowledge and the application of health research to improve the health of Canadians; and (i) encouraging innovation, facilitating the commercialization of health research in Canada and promoting economic development through health research in Canada.

OOGP funded research results demonstrated limited longer-term health impacts

NPIs reported (via end of grant reports) on whether the following health impacts had resulted from their grants: Professional Practice, Patients' or Public Behaviour(s), and Vaccines/Drugs. NPIs indicated whether the outcome was advanced, newly developed, or whether it would result in the future. Low levels of all types of relevant health impacts were reported by NPIs (6-12% advanced and 2-4% newly developed outcomes, depending on the specific outcome type). However, about one-third (29-37%) indicated that these outcomes may result in the future. Comparable impacts from end of grant reports were not explored in the previous evaluation; instead, case studies provided insight on potential impacts. As with the range in findings presented for advancing knowledge above, it is not entirely clear why responses vary; however, a number of outcomes related to health impacts are measured in the end of grant report, some of which may simply not be applicable to all grants (e.g., vaccines/drugs). Additionally, the end of grant report includes multiple measures of the same constructs (i.e., advanced, newly developed, may occur in the future), which may also contribute to variation in responses.

Due to low sample sizes, statistical comparisons were not undertaken; however, some observable differences by pillar were found. Professional Practice (combining Newly Developed and Advanced forms) was reported more frequently for Pillars 2-4 (31-41%) than for Pillar 1 grants (8%); whereas Vaccines/Drugs were reported most frequently for Pillar 1 (11%) compared to all other pillars (4-6%).

Open-ended responses from a random selection of grants specifying advanced health impacts related to Professional Practice (n = 141), Vaccines or Drugs (n = 59), and Patients' or Public Behaviour (n = 109) were analyzed.

The most frequently reported types of health impacts related to Professional Practice concerned clinical and professional practices (n = 68 of the 72 grants that specified an outcome). This was true both for grants that had identified an existing health impact and for grants for which an impact was expected in the future. Of the grants that had led to existing health impacts in clinical and professional practice, the majority (80% of 51) were related to determinants of health (specifically modifiable risk factors), and more than half (63%, n = 28 of 51) led to advances in prevention, diagnosis, and treatment (not mutually exclusive). The type of practitioner most frequently affected, by both existing and potential future health impacts, was clinical practitioners (58%, n = 42 of 72).

Among grants referencing an impact related to Vaccines and/or Drugs (76% of 59), only one had achieved a marketed vaccine or drug; the most frequently identified type of impact was knowledge/understanding of a current drug (n = 18).

Very few NPIs (six of 109) specified an impact related to Patients' or Public Behaviour. Due to the low sample size, this indicator of health impact was not reported.

None of the case studies were close to producing population-level impact on health determinants or health status. Indeed, the case studies served to illustrate that the time horizon to achieve health impact can be expected to be very long. Overall, while the majority of cases have not yet improved population health or well-being, their potential health impacts at the population level are enormous. Direct contributions to likely health impacts were observed in two of the eight cases. In one case – a research program associated with Pillar 3 (Health systems/services) – interventions attributable to the case study research are becoming mainstreamed, with the help of substantive follow-on research funding from health jurisdictions. In the second – a research program associated with Pillar 2 (Clinical) – a pilot health service based on findings from basic clinical science was implemented.

A key success factor for achieving health impact, identified a priori and supported by the case study findings, was the availability and interest of multiple, sequential next-stage funders/partners in supporting next-stage research and development. In all cases, where next-stage funders or partners were available and interested, the OOGP grant findings could be developed further and research results translated into potential products, processes, policies, and other applications, some with commercialization potential.

End of grant reports also included questions relating to the broad health impacts covered by CIHR's mandate. Over half of researchers indicated their grants may contribute to the following mandate areas in the future: improving health for Canadians (63%); creating more effective health services and products (54%); and strengthening the Canadian healthcare system (52%).

Limited Socio-Economic Impacts Resulted from OOGP Funded Research

CAHS refers to socio-economic impacts as broad economic and social impacts that include benefits from the commercialization of research findings, the net benefit of improving health and well-being, and the social benefits arising from health research. A key objective of the OOGP and the new suite of programs is the commercialization and use of health-related knowledge. It is also a key priority for CIHR as evidenced in its strategic plan (Roadmap II, Strategic Direction 1) and training strategy. CIHR's Act also specifies the following objectives related to socio-economic impacts: (h) promoting the dissemination of knowledge and the application of health research to improve the health of Canadians; and (i) encouraging innovation, facilitating the commercialization of health research in Canada and promoting economic development through health research in Canada.

OOGP grants are expected to contribute to the use and commercialization of health-related knowledge. NPIs reported on whether the following outcomes related to commercialization and broad socio-economic impacts had resulted from their grants: software/databases, intellectual property claims, product licences, and spinoff companies. NPIs indicated whether the outcome was advanced, newly developed, or whether it would result in the future. The proportion of grants identified as producing any of these commercialization outputs, either advanced or newly developed, was very low (2%-8%; consistent with the previous evaluation). Furthermore, few NPIs (10-20%) felt such outcomes may result in the future.

The proportion of grants reporting direct cost savings, either advanced (4%) or newly developed (2%), was very low, consistent with the previous evaluation. However, close to one third of grants (30%) reportedly may produce direct cost savings in the future. The literature suggests the timespan to achieve health impacts is approximately 17 years (Grant, Green, & Mason, 2003; Wratschko, 2009), so such longer-term impacts would not be expected to be observed by the time an end of grant report is completed; indeed, the case studies highlight that the timespan to achieve longer-term impacts could be even longer.

None of the case studies attained broader socio-economic impacts, although all had some longer-term potential for an eventual major contribution to prosperity and wellbeing in Canada. Commercialization achieved to date is limited to research enablers rather than marketed health interventions. Except for one case, the potential for commercialization varies in size and time horizon, but is generally quite distal and outside the capacity and interests of the case study research teams. Broader impacts are likely in one case where interventions attributable to the case study research are becoming mainstreamed; however, the outcome will relate not to cost savings but to using health system resources more effectively.

Design and Delivery: Foundation and Project Grant Programs and Cost Efficiency of OSP

Key Findings

The Current Design of the OSP is Very Different than the Intended Design

The design and delivery aspects of the current evaluation focused on assessing whether the FGP and PGP were designed and delivered to achieve expected outcomes; whether they were on track to meet those objectives; and the programs’ capacity for data collection, management, and analysis to inform future decisions.

The FGP and PGP have undergone many changes since their launch, informed by a variety of sources internal and external to CIHR

Broadly, by design as well as through an incremental implementation process, both the FGP and the PGP have undergone many changes since their launch. Program changes were expected given CIHR's intent of continuous improvement throughout the reforms. For the FGP, the changes had more of an impact on program objectives, culminating in the program being sunset in 2019. The changes to the PGP have been mainly operational, with little impact on program objectives (more details below). Although the majority of changes to the FGP and PGP have been informed by feedback from sources both internal and external to CIHR (i.e., pilot studies, internal reviews and data analyses, research community input and research studies, the July 2016 working meeting with the Minister, the Peer Review Working Group, the Internal Audit Consulting Engagement on the reforms implementation, the Peer Review Expert Panel, and the FGP Review Committee), it is clear that the programs have not achieved all expected outcomes at the current stage of implementation. However, the programs were designed in line with international standards and resulted in the funding of excellent research.

Operational challenges were identified during the 2014 and 2015 pilot competitions, which led to a series of early design changes. These challenges were recognized through survey feedback from reviewers, applicants and competition chairs. Recall that the pilots were conducted in a 'live' manner (i.e., piloted during routine program delivery across several programs). Initial design elements associated with the new programs included structured applications, remote/virtual review, a new rating scale, and a streamlined CV in 2013. Broadly, the changes included clarifying adjudication and application criteria and guidelines; increasing limits on, and adding to, sections of the Foundation CV and Stage 2 application; changing sub-criteria weighting and budget justification; adding reviewer training; and exploring the benefits and operational requirements of introducing synchronous reviews.

Further challenges were identified through the July 2016 working meeting with the Minister, stemming from the research community's concerns about the reforms. The recommendations from the meeting included additional operational changes (e.g., further clarification and improvements to the application and peer review processes), along with the acknowledgement that the challenges the reforms were meant to address (e.g., reviewer and applicant burden) were not being alleviated.

The Peer Review Working Group, established based on a recommendation from the July 2016 working meeting, recommended additional changes to the design of the program, targeted at addressing the challenges associated with the original intent of the reforms. These changes included: further revision of the eligibility and adjudication criteria; the removal of asynchronous online discussion (despite earlier attempts to enhance the effectiveness of this approach); reversion to a numeric scoring system; reinstating Stage 2 face-to-face reviews; additional reviewer training; launching the pilot observer program in peer review for early career researchers; limiting the number of applications per competition; encouraging grantees to agree to be reviewers if invited; and equalizing success rates for early career researchers in PGP competitions.

A major change to the funding allocation of the two programs, announced by CIHR in July 2017, was based on feedback from the research community and the findings of the PREP. The PREP viewed the allocation of 45% of the investigator-initiated budget to the FGP as "ambitious and too high at this stage and in the context of available funding." Initially, the proportions of CIHR's investigator-initiated research budget were 45% and 55% for the FGP and PGP, respectively. These allocations were changed to 22% and 78%, reducing the total funding envelope for the FGP from $200M to $125M; the $75M reduction was redirected to the PGP.

The OSP selected and funded research excellence

A RAND Europe study, commissioned by CIHR for the PREP, showed that grant allocation and peer review principles and practices at CIHR were aligned with those of major international health research funders and that in some cases, CIHR appeared to be ahead. Additionally, bibliometric analyses were completed to inform the PREP, comparing researchers who received FGP and PGP grants to those who applied but were not funded. Similar to the OOGP, results indicated that these new programs are also attracting the highest caliber researchers, based on ARC and ARIF scores. Applicants, irrespective of funding status, outperform all researchers from Canada and, in most cases, those from other OECD countries. Furthermore, the peer review process is selecting the best from among the available applicants: funded applicants outperformed unfunded applicants across all bibliometric indices.
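
For context, field-normalized indicators such as the ARC are typically computed by dividing each paper's citation count by the expected citations for papers in the same field and publication year, then averaging the ratios for a group of researchers. The sketch below illustrates the general idea with hypothetical data; the evaluation's actual bibliometric analyses drew on a full citation database, which this sketch does not reproduce.

```python
import pandas as pd

papers = pd.read_csv("publications.csv")  # hypothetical: one row per paper

# Expected citations per (field, year) cell. Here this is approximated from
# the sample itself; real bibliometric analyses derive it from a complete
# citation database covering all world publications.
expected = papers.groupby(["field", "pub_year"])["citations"].transform("mean")
papers["relative_citations"] = papers["citations"] / expected

# ARC per group (e.g., 'funded' vs. 'unfunded' applicants); values above
# 1.0 indicate citation rates above the average for the same fields/years.
arc = papers.groupby("group")["relative_citations"].mean()
print(arc)
```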

Despite strong design and planning, implementation challenges led to a failure to achieve some program objectives

Evidence from both the Internal Audit review and the PREP indicated that, while the implementation of the reforms benefitted from well-developed planning tools, there was implementation failure.

Specifically, CIHR's Internal Audit review of the reforms' implementation acknowledged the benefits of well-developed planning tools, but also raised concerns around governance, information-sharing, communications, reporting, project planning, and stakeholder engagement. The PREP concluded that the design intent and logic of innovation regarding the open grant programs and the process of peer review were sound. However, poor implementation, coupled with a resource-constrained health research funding environment and the introduction of many simultaneous changes at CIHR, made the reforms problematic and led to a loss of trust by stakeholders. The PREP noted several implementation failures, including the failure to: effectively pilot the applicant-to-reviewer matching algorithm (which is no longer being used; internal analyses were conducted by Funding Policy and Analytics (FPA) and Business Systems and Continuing Improvement for Project); have the College of Reviewers in place at the outset of the reforms; effectively engage the research community throughout the reforms; and maintain the trust and confidence of CIHR's main stakeholders, the research community and Canadians, as represented through politicians.

More specifically, one of the challenges identified during the initial consultations was reviewer burden. The Panel observed that although CIHR did preliminary modelling of the new grant system and its process components, delays and problems in implementation and subsequent "course corrections" resulted in program delivery not matching the modelling, partly because the assumptions underpinning the original design proved incorrect. Some applicants understandably applied to both programs in parallel. Thus, the cancellation of a funding round caused problems to cascade beyond the parameters of the prior planning and created a situation in which many more applications were received than expected. This, in turn, placed more burden on the reviewer allocation systems.

As indicated above, the PREP noted that the College of Reviewers, introduced as part of the reforms to address gaps in reviewer expertise availability and inconsistency of reviews, was not ready when the reforms were implemented. It also noted that the delayed implementation of the College further compounded the technological challenges with reviewer matching and assignment.

The College is a member-focused resource designed to professionalize peer review, enhance review quality, and provide a more stable base of experienced reviewers for all funding competitions. The inaugural slate of College Chairs was announced in 2016, two years after the first FGP live pilot, and the College began enrolling members in June 2017. Once established, the College created several customized learning programs and webinars aimed at enhancing the quality of reviews (e.g., Conducting Quality Reviews, Unconscious Bias in Peer Review); developed an evidence-informed quality assurance framework to be enhanced and validated by the research community; and established stringent membership criteria for different categories of membership. The College is working with Program Delivery to promote the use of College members, but reviewer recruitment for the PGP is subject to endorsement by competition chairs, who may not necessarily choose College members. For the FGP and PGP, the evidence shows strong uptake: College members made up 78% of reviewers for the Fall 2017 and Spring 2018 Project competitions and 99% of reviewers for the 2017 Foundation Grant competition.

Male researchers have higher success rates compared to female researchers

Success rates are higher for male researchers than for female researchers across all OOGP competitions and the first two FGP competitions. This gap narrowed for the last FGP competition and all PGP competitions (Figure I: OSP Success Rates by Sex). Across the evaluation period, the majority of OOGP researchers were male (72%; female: 28%). These proportions were consistent across Pillars 1 and 2, while the proportions for Pillars 3 and 4 were relatively even, with slightly more female researchers holding these grants (Figure J: Proportion of OOGP Grants by Sex and Pillar).

Research shows sex and gender biases in funding decisions, specifically related to OOGP and FGP programs

There is evidence of sex and gender biases in funding decisions, both globally and more specifically within the OOGP and FGP programs at CIHR. Research shows that male researchers tend to secure more research funding than female researchers, internationally, regardless of discipline (Witteman, Hendricks, Straus, & Tannenbaum, 2019). Data from the UK, US, Denmark, and the EU suggest that women have received less grant funding (European Research Council, 2017; Ginther et al., 2011; National Institutes of Health, 2016; Pohlhaus, Jiang, Wagner, Schaffer, & Pinn, 2011) and held fewer large grants (McAlliser, Juillerat, & Hunter, 2016).

This research was unclear as to whether the difference was attributable to characteristics of the research or of the researchers themselves. However, recent research by Witteman and colleagues (2019), examining CIHR's IIR competitions between 2011 and 2016, identified that funding gaps related to the sex of the researcher were tied not to the relative quality of the research proposals but to the assessment of the applicants themselves. Specifically, female principal investigators were reviewed less favourably than male principal investigators.

More specifically, they found minimal differences overall between the success rates of male and female applicants (average of 15.8%), and negligible differences in PGP competitions (0.9% lower for females). However, in the FGP, in which peer review focuses on the applicant's caliber, females had a significantly lower success rate than males (4% lower).
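
For illustration, a success-rate gap of this kind can be tested with a two-proportion z-test on funded and applicant counts by sex. The counts below are placeholders, not figures from these studies.

```python
from statsmodels.stats.proportion import proportions_ztest

funded = [150, 80]        # hypothetical: funded male, funded female
applicants = [1000, 700]  # hypothetical: total male, total female applicants

z_stat, p_value = proportions_ztest(funded, applicants)
rates = [f / n for f, n in zip(funded, applicants)]
print(f"success rates: {rates[0]:.1%} vs {rates[1]:.1%}, p = {p_value:.3f}")
```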

A retrospective study of CIHR grant funding by Burns and colleagues (2019) also found that female researchers had significantly lower grant success in OOGP competitions than male researchers, and that female researchers consistently submitted fewer grant applications. The study also found sex differences in funding by research area: applications submitted by female researchers were less likely to be funded by the Institutes of Cancer Research, Circulatory and Respiratory Health, Health Services and Policy Research, and Musculoskeletal Health and Arthritis, whereas female researchers' applications to the Institute of Aboriginal Peoples' Health were more likely to be funded than those submitted by male researchers.

A recent perspective by Tannenbaum, Ellis, Eyssel, Zou, and Schiebinger (2019) notes that policy changes related to sex and gender analysis have occurred at several major funding agencies, including CIHR (2010), the European Commission (2014), the National Institutes of Health (2016), and the German Research Foundation (2020). The authors suggested that standardized methods of sex and gender analysis need to be developed and implemented through a coordinated effort among researchers, funding agencies, peer-reviewed journals, and universities.

Early career researchers have lower success rates compared to mid- and senior career researchers

In general, early career investigators (ECIs) have lower success rates than established researchers; this was more pronounced in the first FGP competition (2014-15) but has evened out across PGP competitions (Figure K: OSP Success Rates by Career Stage). ECIs were no longer eligible to apply to the FGP as of 2017-18. Starting with the Fall 2016 PGP competition, success rates for ECIs were equalized; that is, the proportion of ECIs funded would equal the proportion of ECI applicants to the competition. An additional $30 million made available in Budget 2016 was to focus mainly on ECIs.
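
The equalization rule is straightforward to express: the share of grants awarded to ECIs is set to match the ECI share of applicants. A minimal sketch, with hypothetical inputs:

```python
def eci_quota(n_eci_applicants: int, n_total_applicants: int,
              n_grants_available: int) -> int:
    """Number of grants directed to ECIs so that the funded ECI
    proportion matches the ECI applicant proportion."""
    eci_share = n_eci_applicants / n_total_applicants
    return round(n_grants_available * eci_share)

# Example: 600 ECI applicants out of 3,000, with 450 grants to award.
# ECIs are 20% of applicants, so 90 of the 450 grants go to ECIs.
print(eci_quota(600, 3000, 450))  # 90
```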

The average age of OOGP-funded researchers was 47 years (SD = 8.57; range: 27-82), and the majority were at the senior or mid-career level when they received their OOGP grant (41% and 40%, respectively; 19% were early career researchers). Across pillars, this breakdown is consistent for Pillar 1, while for Pillars 2 through 4 the majority of NPIs are mid-career researchers (Figure L: Proportion of OOGP Grants by Career Stage and Pillar). Pillars 2 through 4 also have a higher proportion of early career NPIs, with Pillar 3 having the highest proportion.

English language applications generally more successful than French language applications

English language applications have generally had higher success rates than French language applications, except in the OOGP 2010-11 and FGP 2016-17 competitions (Figure M: OSP Success Rates by Preferred Language). Due to the low volume of French language applications received and approved, these results should be interpreted with caution. Almost all OOGP-funded researchers included in the current evaluation (97%) indicated English as their preferred language, with no observable differences in age or language across pillars.

The Foundation Grant Program was sunset

Recommendations from the FGP Review Committee were discussed at Science Council in November 2018. The committee's mandate was to 1) provide recommendations on the FGP objectives, design, application and peer review processes to better position the program to advance CIHR's strategic objectives; 2) guide continuous improvements that are responsive to the concerns and changing needs of the health research community; and 3) provide guidance on funding levels and allocations to ensure that funding is both equitable and sustainable. Specific recommendations included a single-stage application/peer review process; face-to-face review; scoring consistent with the PGP; allocating 25% of CIHR's IIR budget to the FGP (with flexibility to adjust); tracking EDI/GBA information to address any biases; and continuing to exclude early career researchers.

CIHR sunset the FGP in April 2019 based on a number of consultations, the input of the FGP Review Committee, a critical review of administrative and competition data, and preliminary findings from this evaluation. The review of administrative and competition data highlighted unintended consequences in funding distribution within the program that were deemed unacceptable (e.g., funding a disproportionate number of applicants who were older, from larger institutions and conducting Pillar 1 research, as well as inequity for female applicants in Stage 1). CIHR indicated that adjustments to the program would not be sufficient. CIHR also acknowledged that the peer review process did not align with the renewed commitment to face-to-face review and had not reduced reviewer burden as originally envisioned. The Peer Review Working Group (2016) had also recommended equity across different career stages and sexes of applicants. The Policy on Results reinforces the need for equity through its requirements for consideration of equity, diversity and inclusion, including sex and gender. CIHR is committed to addressing any unconscious biases in its processes to ensure equitable access to research funding (e.g., the Interagency Committee on Equity, Diversity, and Inclusion; the Interagency Committee on Early Career Researchers; the Tri-agency Statement on Equity, Diversity and Inclusion; and the Tri-agency Equity, Diversity and Inclusion Action Plan 2018-2025).

Review of PGP objectives is needed based on implementation and programmatic changes as well as lack of alignment with CIHR Act

The OSP has been operating in a space of continued programmatic change, with several elements that were not rolled out as intended in terms of functionality and timing (e.g., reviewer matching software, College of Reviewers), noted implementation challenges (Consulting Engagement; PREP), and the subsequent sunset of the FGP. At the same time, the OSP has been successful in attracting and funding research excellence (OOGP-funded researchers and FGP and PGP applicants are more productive and impactful than health researchers in Canada and other OECD countries based on traditional bibliometric measures); facilitating the creation, dissemination, and use of health-related knowledge (mainly within academia); and contributing to the development and maintenance of Canadian health research capacity by supporting original, high-quality projects proposed and conducted by individual researchers or groups of researchers in all areas of health research. The assessment of the degree to which the OSP is on track to meet expected outcomes focuses on the PGP, given that it is the only program remaining.

The FGP and PGP were intended to operate in tandem, with the objectives of both programs broadly encompassing the objectives of the OOGP. Although there is overlap between the objectives of the PGP and the OOGP, they are not completely aligned. In addition, the objectives of the PGP are not fully aligned with the CIHR Act. Recall that the objectives of the PGP during the period under review were to:

  • capture ideas with the greatest potential for important advances in fundamental or applied health-related knowledge, health research, health care, health systems, and/or health outcomes, by supporting projects with a specific purpose and a defined endpoint;
  • support a diverse portfolio of health-related research and knowledge translation projects at any stage, from discovery to application, including commercialization; and
  • promote relevant collaborations across disciplines, professions, and sectors.

Specific objectives from the OOGP missing from the PGP include supporting original and high-quality projects or teams/programs of research, and developing and maintaining Canadian health research capacity, including research training. Specific objectives from the FGP not covered by the PGP objectives include supporting research leaders, innovative lines of inquiry, and the development and maintenance of Canadian capacity in research and other health-related fields. The objectives related to building capacity, which were included in both the OOGP and FGP, speak directly to sub-objective 4(j) of the CIHR Act, which includes “building the capacity of the Canadian health research community through the development of researchers and the provision of sustained support for scientific careers in health research”.

Due to the overlap between the objectives of the PGP and OOGP, it is likely that similar results related to knowledge creation and capacity building in health research will be achieved. However, it is currently unclear whether all objectives of the PGP will be met. Without the clear articulation of capacity building as an outcome, its measurement may be challenging and expectations for the research community unclear. Furthermore, the PGP articulates longer-term impacts and wider knowledge translation beyond academia (i.e., through the promotion of collaboration, commercialization and use of health-related knowledge), yet the results related to the outcomes and impacts of the OOGP in the current evaluation show that knowledge translation beyond academia, as well as longer-term health and socio-economic impacts, are limited. Findings from the analysis of end of grant reports and case studies also support the need for clearer articulation of expected results in terms of longer-term outcomes and impacts (e.g., health impacts, broad socio-economic impacts), as not all researchers interpret them the same way based on their field of research, nor do all researchers consider these to be overt goals of their research (e.g., Pillar 1).

It should also be noted that findings from the previous evaluation, end of grant reports, and case studies support the need for a longer-term (i.e., programmatic/longitudinal research) or renewable funding formula for health researchers. This is in line with the provision of sustained support described in the Act, as well as the objectives of the sunset FGP. Clarity is also needed on what is meant by “relevant collaborations”.

The adjudication criteria and eligibility requirements should also be reviewed to ensure alignment with objectives. For example, 75% of the adjudication weighting in the PGP is based on feasibility (50% related to approaches and methods; 25% related to expertise, experience and resources), and it is unclear how this facilitates ideas with the greatest potential for important advances in health across the four pillars and across stages (e.g., discovery to application, including commercialization). Similarly, it is unclear how the PGP promotes relevant collaborations across disciplines, professions, and sectors, given that the eligibility criteria do not explicitly require collaborators to be listed on the grant application.

In sum, given the programmatic changes (from the intended to the current elements of the programs, and the sunset of the FGP), findings from the current and previous evaluations, and recommendations from the PREP and the FGP Review Committee, the objectives of the PGP should be reviewed to ensure they include all relevant aims and clearly align with the CIHR Act (e.g., capacity building).

Improvements in Data Collection Capacity are Needed

There is a lack of short-term performance data to inform programmatic decision-making

A large volume of application and competition data is collected at CIHR by a variety of units (e.g., Program Design and Delivery, Funding Policy and Analytics, Results and Impact Unit). While much of this data is necessary and useful for program monitoring, some of it is not being analyzed or used in a systematic way. For example, Funding Policy and Analytics (FPA) collects data from applicants, peer reviewers, and committee chairs at all stages of each FGP and PGP competition. Although the data collection tools have been streamlined somewhat since the live pilots, a large amount of implementation data is still being collected that is currently not used and may be contributing to respondent burden.

Given the large volume of data being collected and the multiple stakeholders involved, the OSP would benefit from a clearly defined data collection strategy with defined objectives for continued data collection. Clarity around roles and responsibilities for the collection, management, and use of OSP data is needed, as are considerations of data quality, reliability, and respondent burden.

There are concerns about the availability and reliability of mid- to longer-term performance data with the current end of grant report

At the same time, there is currently a lack of output and outcome data being collected to assess progress on objectives beyond the end of grant report. Given that the end of grant report is not administered until after grant expiry, and given the stage of implementation of the FGP and PGP, these data will not be available for some time. This was particularly problematic for the FGP given that its grants are seven years in length. Moving from data collection focused on implementation (e.g., application and competition experiences) to annual or periodic progress reporting would help CIHR assess the interim performance of the new programs, in line with the idea of continuous improvement, and enable more timely course corrections if needed. During the period under review, programmatic decisions were based solely on implementation and competition data, not performance or output/outcome data.

In the case of the FGP, the Program Design and Delivery (PDD) branch created a Case Management Tool, through extensive consultations with stakeholders, to collect interim information about outputs and outcomes from the NPIs of Foundation Grants. An annual report survey was developed that collected information about the grantee (demographic and employment information), their program of research and additional funding, barriers to achieving objectives, capacity building, stakeholder involvement, knowledge products related to the grant, and success stories. The PDD was in the process of launching the tool, but the resources dedicated to monitoring were reallocated to program delivery; thus, the tool was not implemented during the evaluation period. A similar tool does not exist for the PGP; however, one would likely be beneficial in complementing the application experience data collected by FPA and tracking the outputs and outcomes of PGP grants.

The end of grant reports (data recorded in the Research Reporting System; RRS) are the main data source for outcome data as well as the indicators identified in the program’s Performance Measurement Strategy and now the IIR Program Information Profile (PIP) under the Departmental Results Framework (DRF). There are concerns about the availability and reliability of the RRS data from the current end of grant report (e.g., self-report; low response rates; variability in completion times; overall length, structure, and type of questions included) as well as concerns about the ability to quickly analyze and report on the data collected from it.

End of grant reports are self-reported, and although this is a commonly used and accepted approach for data collection, it is prone to biases, potential recall issues, and issues with attribution and contribution of funding, given that researchers often have multiple sources of funding and broader programs of research. The end of grant reports are very long, which can lead to respondent fatigue. The length, coupled with issues with the structure and content of the report, poses additional concerns: inconsistent scales across questions, multiple constructs in a single question, a lack of mutual exclusivity in some response items, and a lack of definition of terms, all of which make responses difficult to interpret and analyze. The response rates are also very low (29% for the OOGP in the current evaluation). There is currently no known mechanism in place to ensure compliance, and many NPIs still do not submit an end of grant report, despite the reports becoming mandatory in 2011. For those who do complete them, there is a wide range in time to completion, despite the guideline that they should be completed within 18 months of grant expiry. Lastly, the data collected by the end of grant reports have historically rarely been used outside of evaluations, with no regular reporting on grant outcomes by PDD. However, the Results and Impact Unit (RIU), responsible for performance measurement, uses end of grant data for the relevant DRF indicators (approximately 11 indicators).

CIHR is making advances in data governance; however, challenges with data ownership and management further affect the ability to monitor and assess program performance

CIHR is currently centralizing its data function through FPA, with all data requests coming in through this team; however, there is still a lack of clarity as to who owns and manages different sources of data. FPA currently handles all internal and external requests for administrative data. It also works with other relevant units to address requests beyond administrative and competition data, as some data is still collected and managed elsewhere (e.g., financial data, peer review data). For example, if Finance-related data is needed, FPA works with Finance staff to procure it. Similarly, if competition data is needed (e.g., applications, peer review committee members), FPA may work directly with Program staff to obtain it. However, end of grant report data was owned by the RIU (at the time of the evaluation), and requests for this data needed to be made directly to that unit.

The variation in data ownership and management can create additional challenges with the availability, accuracy, and consistency of data. For example, OSP expenditure data provided separately by Finance and FPA for the current evaluation did not match, requiring extra time and resources for validation. The difference was less than $1M per year between 2011-12 and 2013-14, but the differences after 2013-14 were greater (ranging between $5M and $11M). Overall, the variation represented approximately 1% of overall expenditures and likely resulted from differences in definitions, underlying assumptions, and the types of exclusions taken into consideration.
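A validation exercise of this kind amounts to joining the two expenditure series by fiscal year and flagging gaps above a tolerance. The following is a minimal sketch (in Python, using pandas); the column names and the $1M tolerance are illustrative assumptions rather than CIHR system fields, and the FPA figures shown are hypothetical:

    # Minimal reconciliation sketch: compare expenditure series from two sources.
    import pandas as pd

    finance = pd.DataFrame({
        "fiscal_year": ["2011-12", "2012-13", "2013-14"],
        "expenditure": [433_577_824, 449_679_548, 456_160_828],  # figures from Table 5
    })
    fpa = pd.DataFrame({
        "fiscal_year": ["2011-12", "2012-13", "2013-14"],
        "expenditure": [433_100_000, 449_300_000, 455_900_000],  # hypothetical FPA figures
    })

    merged = finance.merge(fpa, on="fiscal_year", suffixes=("_finance", "_fpa"))
    merged["difference"] = (merged["expenditure_finance"] - merged["expenditure_fpa"]).abs()
    # Flag years where the two sources diverge by more than $1M for manual validation.
    print(merged[merged["difference"] > 1_000_000])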

CIHR’s data governance program was put in place in 2017 with the aim of enabling readily available and trusted data to facilitate evidence-based insights and decision making. Since then, the Data Governance Working Group has developed a vision and mission for data governance, defined roles for data management, developed a data management framework, drafted a decision-making structure (RACI), and defined elements for a CIHR Business Data Glossary. The Data Governance Steering Committee approved the CIHR Data Management Framework, and a Data Steward Committee was launched, with regular meetings held to discuss data governance elements.

Although the data governance steps taken so far are necessary and beneficial for CIHR, program-level data collection strategies would also ensure that CIHR is able to regularly report on and assess progress towards objectives, and would allow for evidence-informed decision-making beyond the requirements of performance measurement (annual, based on the IIR Program Information Profile) and evaluation (every five years).

CIHR’s OSP is being delivered in a cost-efficient manner

Grant expenditures and administrative costs have increased since 2010-11

Grant expenditures for the OSP (OOGP, FGP, PGP) have increased steadily from $420M in 2010-11 to $539M in 2017-18 (Table 1: Operating Support Program Expenditures (2011-12 to 2017-18) in Millions). Administrative costs have also increased steadily from $26M to $34M from 2010-11 to 2017-18 (Table 5: OSP Administrative Costs as Percent of Total Program Expenditure, 2010-11 to 2017-18). Administrative costs include the direct costs incurred by the Research, Knowledge Translation and Ethics (RKTE) Portfolio in delivering the OOGP, FGP, PGP, and indirect or internal services costs incurred by other Portfolios and Branches whose activities support the delivery of the program. Examples of internal services include staff in Corporate and Governmental Affairs, Communication and Public Outreach, Senior Executive offices, and the Resource Planning and Management Portfolio (e.g., Evaluation, Audit, RIU, Finance, Human Resources, and Information Technology).

The OSP’s annual total administrative costs translate into administrative costs per eligible application ranging between $5,386 and $8,427 and costs per grant awarded in the range of $32,887 to $55,529 since the last OOGP evaluation (2010-11). Applying the same methodology retroactively to OOGP data for 2010-11 shows comparable but slightly lower costs of $5,564 per eligible application and of $32,164 per grant awarded (Table 6: OSP Costs per Application and per Grant Awarded, 2010-11 to 2017-18).
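To make the unit-cost calculation explicit: the figures in Table 6 are simple ratios of annual administrative costs to annual application and grant volumes. For 2010-11, for example:

$$\text{Cost per application} = \frac{\$25{,}795{,}767}{4{,}636 \text{ applications}} \approx \$5{,}564, \qquad \text{Cost per grant} = \frac{\$25{,}795{,}767}{802 \text{ grants}} \approx \$32{,}164.$$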

Although a different method was used to assess cost-efficiency in the previous evaluation of the OOGP (2012), it concluded that the OOGP was being delivered efficiently and that costs per application were in line with the limited available benchmarks from other research funders. The full cost per grant application was $13,997, as compared to $18,896 for the Project Grant Scheme of the National Health and Medical Research Council (NHMRC; Australia), while the direct administrative costs per application were $1,307, $1,022, and $1,893 for the OOGP, the NHMRC, and the National Institutes of Health (US), respectively.

The ratio of administrative costs to total program expenditures has ranged between 5-6%

The available evidence shows that the OSP has been delivered in a cost-efficient manner; the ratio of administrative costs to total program expenditures has ranged between 5.3% and 6.0% since the last OOGP evaluation in 2010-11 (Table 5: OSP Administrative Costs as Percent of Total Program Expenditure, 2010-11 to 2017-18). The previous evaluation also found the OOGP was being delivered efficiently; however, no improvements in efficiency have been observed since the 2012 evaluation. This is noteworthy for the OSP given that one of the goals of the reforms was to enhance efficiency in a number of areas, including but not limited to peer reviewer workload, applicant burden, and churn.

Conclusions & Recommendations

The evaluation found that funding investigator-initiated research remains an effective means to support health research and build health research capacity. Given the recent programmatic shifts in the OSP, most notably the sunset of the FGP, the evaluation makes recommendations focused on informing and improving the design, delivery, and performance of the PGP.

Conclusions

The OSP addressed a continued need for investigator-initiated health research

Given the nature and extent of the investment in the OSP, CIHR is addressing the continued need for investigator-initiated health research. The evaluation found that CIHR’s investments in the OSP are aligned with Government of Canada priorities, as reinforced by Canada’s Science Vision, the Fundamental Science Review, and the Federal Budgets (2018 and 2019). Broadly, the OSP aligns with the CIHR Act, CIHR’s roles and responsibilities, and the strategic directions of Roadmap II, specifically promoting excellence, creativity, and breadth in research.

The OSP contributed to advancing knowledge creation and building health research capacity

The evaluation found that the OSP has been attracting and funding research excellence (OOGP-funded researchers and FGP and PGP applicants are more productive and impactful than health researchers in Canada and other OECD countries). The evaluation also found that OOGP funding (across pillars, although the majority are Biomedical grants) has successfully facilitated the creation, dissemination, and use of health-related knowledge (mainly within academia), and has contributed to building Canadian health research capacity by increasing the number of researchers and trainees indirectly supported by these grants.

OOGP-funded research has demonstrated limited translation of knowledge beyond academia and limited longer-term health and socio-economic impacts

OOGP grants are expected not only to create knowledge but also to contribute to the dissemination, commercialization/knowledge translation, and use of health-related knowledge, as evidenced by the objectives of all OSP programs (OOGP, FGP, PGP). This is also a key priority for CIHR, as evidenced in its strategic plan (Roadmap II, Strategic Direction I) and its Act, which specifies the following related objectives: (h) promoting the dissemination of knowledge and the application of health research to improve the health of Canadians; and (i) encouraging innovation, facilitating the commercialization of health research in Canada and promoting economic development through health research in Canada. However, the evaluation shows that less than half of OOGP grants involve and impact stakeholders beyond researchers and study stakeholders, as reported by NPIs through end of grant reports. Similarly, less than 15% of grants resulted in the translation of knowledge beyond academia, longer-term health impacts, or socio-economic impacts.

CIHR needs to better define and align the objectives of the PGP in relation to the CIHR Act

The OSP has undergone many changes since the launch of the new programs under the reforms, with several elements that were not rolled out as intended in terms of functionality and timing (e.g., reviewer matching software, College of Reviewers), and with noted implementation challenges (Internal Audit Consulting Engagement, July 13, 2016; Working Meeting with the Minister; PREP). Despite the broad alignment of the OSP with the CIHR Act, the evaluation found that the current objectives of the PGP lack alignment with the Act, specifically regarding building Canadian health research capacity, whereas this was a specific objective of both sunset programs (OOGP and FGP). Given the major programmatic changes in the OSP, and given that the PGP is the only remaining program, a review of the PGP objectives is needed.

CIHR needs to improve monitoring and assessment of the outcomes and impacts of its investigator-initiated research

While there is a wealth of application, competition, and implementation data available for the OSP (e.g., surveys about the application and decision processes), there is currently a lack of output/outcome data being collected to assess progress on objectives beyond the end of grant report (which is only administered after grant expiry, with completion expected within 18 months). Programmatic changes to date have been made in the absence of performance data beyond the OOGP. Given the time it takes to observe longer-term impacts, the end of grant report may not be the most effective approach to collecting this data. Furthermore, the evaluation shows that there are concerns about the availability and reliability of the data from the current end of grant report (e.g., self-report; low response rates; variability in completion times; overall length, structure, and type of questions included), and therefore the ability to accurately assess whether OSP programs are effectively achieving their objectives is limited. Although CIHR is making advances in data governance, challenges with data ownership and management (i.e., multiple units are responsible for the collection and dissemination of data) further affect the ability to monitor and assess program performance.

CIHR needs to ensure funding decisions are made equitably

The evaluation showed that there are differences in funding and outcome characteristics by pillar, gender, and career stage across individual OSP programs (OOGP, FGP, PGP) that need to be considered in the design and delivery of the OSP. Additionally, the Peer Review Working Group (2016) recommended equity across different career stages and sexes of applicants. Although the OSP funds researchers across pillars, the majority are from Pillar 1 (Biomedical). Male researchers have higher success rates than female researchers, and research shows sex and gender biases in funding decisions specifically related to the OOGP and FGP. Early career researchers have lower success rates than mid-career and senior researchers, and English language applications are generally more successful than French language applications. The Policy on Results reinforces the need for equity through its requirements for considerations of equity, diversity and inclusion, including sex and gender. CIHR is committed to addressing any unconscious biases in its processes to ensure equitable access to research funding (e.g., Interagency Committee on Equity, Diversity, and Inclusion; Interagency Committee on Early Career Researchers; Tri-agency Statement on Equity, Diversity and Inclusion).

Recommendations

The evaluation makes three recommendations aimed at improving the design, delivery, and performance of the PGP.

Recommendation 1

CIHR should revise the PGP objectives to ensure they are clearly defined, fully aligned with the CIHR Act, and supportive of key aspects of the Act related to building Canadian health research capacity.

Recommendation 2

CIHR needs to ensure that investigator-initiated research funding is distributed as equitably as possible while minimizing the potential for peer review bias. The design and implementation of investigator-initiated grants must account for the differences within the health research community observed by the evaluation (e.g., pillar, sex, career stage, and language), as well as those documented in the research literature more broadly.

Recommendation 3

CIHR needs to improve the monitoring and assessment of activities and investments in investigator-initiated research.

  1. CIHR needs to enhance the way performance data is collected related to capacity building (e.g., indirect support of trainees), knowledge translation beyond academia (i.e., informing decision making), collaborations, health impacts, and broad socio-economic impacts to better understand the full impact of grant funding.
  2. CIHR needs to revise the current end of grant reporting template and process in order to improve the availability, accuracy, and reliability of the data collected.
  3. CIHR should consider additional ways to collect data beyond end of grant reports, via interim reporting as well as longer-term follow-up, to assess impact.

Appendix A - Tables & Figures

Table 1: Operating Support Program Expenditures (2011-12 to 2017-18) in Millions
Component Programs 2011-12 2012-13 2013-14 2014-15 2015-16 2016-17 2017-18 Total (%)
OOGP (MOP) $434 $450 $456 $493 $429 $308 $228 $2,797
(82.5%)
Foundation Grant (FGP)         $67 $133 $157 $358
(10.6%)
Project Grant (PGP)           $82 $154 $236
(6.9%)
Total OSP Investments $434 $450 $456 $493 $496 $523 $539 $3,391
(100%)
Total CIHR Investments $951 $941 $944 $960 $973 $1,025 $1,035  
OSP % of Total Investments 46% 48% 48% 51% 51% 51% 52%  

Source: CIHR Finance, April 2017; CIHR in Numbers, October 2018.

Note: Due to rounding, figures may not reconcile with other published information. Data does not include administrative costs or program costs of randomized controlled trials (RCT) and knowledge translation (KT) related programs that have been rolled into the open program. The former were rolled over earlier in 2009 and the latter were rolled over as part of the reforms.

Figure A: Timeline of CIHR Reforms Process, 2009-2017

Long Description

2009

  • Health Research Roadmap released.

2010

  • Review of health research practices conducted.
  • Development and discussion of new program design concepts begins.
  • CIHR Task Force established to oversee design/ implementation.

2009-2012

  • Consultations held through cross-country town hall meetings and other means, including engagement of CIHR Institutes and their research communities, CIHR's University Delegates, Chairs and Scientific Officers of Peer Review Committees, and CIHR Partners, as well as surveys.

2014

  • Establishment of Interim College of Reviewers Advisory Committee.
  • First Foundation Scheme "live pilot" competition launched.

2015

  • First Project Scheme "live pilot" competition launched.
  • Second Foundation Scheme "live pilot" competition launched.

2017

  • PREP report received by CIHR, Feb 22.
  • Fundamental Science Review report submitted, Apr 10.
  • 2017 Foundation Grant competition launched.
  • 2017 Fall Project Grant competition launched.
  • College of Reviewers begins member enrolment, June.

Figure B: Application pressure and success rates across OOGP, Foundation and Project Grant programs, 2006-07 to 2017-18

Source: Overview of the Reforms to CIHR’s Open Suite of Programs: Peer Review Expert Panel - November 2016. Updated by CIHR’s Evaluation Unit and Results and Impact Unit, September 2018.

Note: T-OOGP refers to the transition year from the OOGP to the FGP (FDN).

Long Description
  OOGP T-OOGP FDN PJT
Fiscal Year 2006-07 2007-08 2008-09 2009-10 2010-11 2011-12 2012-13 2013-14 2014-15 2014-15 2015-16 2016-17 2015-16 2016-17 2017-18
Received (N = 50,474) 3,894 3,625 3,680 4,416 4,636 4,578 4,586 5,389 2,682 1,366 910 600 3,813 2,884 3,415
Approved (N = 8,625) 847 816 772 782 802 801 801 797 383 150 120 76 491 475 512
Success Rate 22% 23% 21% 18% 17% 17% 17% 15% 14% 11% 13% 13% 13% 16% 15%
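
For reference, the success rates shown are the ratio of approved to received applications within each fiscal year and program; for example, for the OOGP in 2006-07:

$$\text{Success rate} = \frac{\text{Approved}}{\text{Received}} = \frac{847}{3{,}894} \approx 22\%.$$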

Figure C: Average Number of Publications and Presentations by Pillar

Source: End of Grant reports from CIHR’s Research Reporting System, April 4, 2017, provided by the Results and Impact Unit.

Long Description
  Journal Articles Invited Presentations Other Presentations
Pillar 1 11.52 13.9 14.59
Pillar 2 8.23 11.87 14.98
Pillar 3 5.63 8.37 8.3
Pillar 4 8.81 10.29 14.39

Figure D: Average Number of Publications and Presentations by Sex

Source: End of Grant reports from CIHR’s Research Reporting System, April 4, 2017, provided by the Results and Impact Unit.

Long Description
  Journal Articles Invited Presentations Other Presentations
Male 11.4 13.48 14.28
Female 8.54 12.1 13.84

Table 2: Knowledge Products, Grant Duration and Amount by Pillar
Cell entries: M (SD); Range; n (% of grants reporting). Pillar group sizes: Pillar 1 n = 2401, Pillar 2 n = 377, Pillar 3 n = 225, Pillar 4 n = 291.

Journal Articles
  Pillar 1: 11.52 (11.19); Range: 1-125; n = 2353 (98%)
  Pillar 2: 8.23 (9.87); Range: 1-80; n = 334 (88.6%)
  Pillar 3: 5.63 (9.35); Range: 1-100; n = 190 (84.4%)
  Pillar 4: 8.81 (17.23); Range: 1-222; n = 248 (85.2%)
  Total: 10.52 (11.70); Range: 1-222; n = 3134 (94.9%)

Conference Presentations (Invited)
  Pillar 1: 13.90 (16.66); Range: 1-234; n = 2181 (92.8%)
  Pillar 2: 11.87 (15.05); Range: 1-108; n = 313 (84%)
  Pillar 3: 8.37 (10.42); Range: 1-70; n = 178 (79.1%)
  Pillar 4: 10.29 (15.36); Range: 1-153; n = 235 (80.8%)
  Total: 13.07 (16.17); Range: 1-234; n = 2916 (88.3%)

Conference Presentations (All Other)
  Pillar 1: 14.59 (17.74); Range: 1-245; n = 1303 (54.3%)
  Pillar 2: 14.98 (26.25); Range: 1-250; n = 252 (66.8%)
  Pillar 3: 8.30 (11.46); Range: 1-124; n = 154 (68.4%)
  Pillar 4: 14.39 (23.63); Range: 1-218; n = 216 (74.2%)
  Total: 14.14 (19.44); Range: 1-250; n = 1931 (58.5%)

Grant Duration
  Pillar 1: 51.23 (11.71); Range: 6-150; n = 2402
  Pillar 2: 41.86 (13.64); Range: 12-60; n = 377
  Pillar 3: 32.80 (11.18); Range: 12-60; n = 225
  Pillar 4: 37.58 (12.42); Range: 12-72; n = 291
  Total: 47.72 (13.43); Range: 6-150; n = 3304

Grant Amount ($)
  Pillar 1: 550,406 (219,690); Range: 30,054-3,500,518; n = 2402
  Pillar 2: 417,693 (284,691); Range: 46,243-2,986,934; n = 377
  Pillar 3: 303,969 (343,608); Range: 40,000-4,730,412; n = 225
  Pillar 4: 408,103 (450,647); Range: 22,322-4,225,469; n = 291
  Total: 506,590 (277,386); Range: 22,322-4,730,412; n = 3304

Source: End of Grant reports from CIHR’s Research Reporting System, April 4, 2017, provided by the Results and Impact Unit.

Table 3: Knowledge Products, Grant Duration and Amount by Sex
Cell entries: M (SD); Range; n (% of grants reporting). Group sizes: Male n = 2374, Female n = 925.

Journal Articles
  Male: 11.40 (12.85); Range: 1-220; n = 2275 (95.8%)
  Female: 8.54 (7.50); Range: 1-52; n = 856 (92.5%)
  Total: 10.62 (11.70); Range: 1-220; n = 3134 (94.9%)

Conference Presentations (Invited)
  Male: 13.48 (17.39); Range: 1-234; n = 2086 (87.8%)
  Female: 12.10 (12.53); Range: 1-106; n = 828 (89.5%)
  Total: 13.07 (16.17); Range: 1-234; n = 2916 (88.3%)

Conference Presentations (All Other)
  Male: 14.28 (21.53); Range: 1-250; n = 1285 (54.1%)
  Female: 13.84 (14.43); Range: 1-124; n = 647 (69.9%)
  Total: 14.14 (19.44); Range: 1-250; n = 1931 (58.5%)

Grant Duration
  Male: 48.98 (13.22); Range: 6-150; n = 2375
  Female: 44.49 (13.43); Range: 12-72; n = 925
  Total: 47.72 (13.43); Range: 6-150; n = 3304

Grant Amount ($)
  Male: 527,436 (268,967); Range: 22,322-4,225,469; n = 2375
  Female: 452,985 (291,673); Range: 30,054-4,730,412; n = 925
  Total: 506,590 (277,386); Range: 22,322-4,730,412; n = 3304

Source: End of Grant reports from CIHR’s Research Reporting System, April 4, 2017, provided by the Results and Impact Unit.

Table 4: Research Staff and Trainees Involved in OOGP Grants
Entries per staff/trainee type: total # grants (% of grants); total # HQP (paid/unpaid); average # HQP per grant, M (SD); range; n.

Researcher: 1878 (56.8%); 6836; 3.64 (5.96); Range: 0.1-201; n = 2051
Research Assistant: 2046 (61.9%); 5782; 2.83 (6.81); Range: 0.2-251; n = 2402
Research Technician: 2085 (63.1%); 3729; 1.79 (1.62); Range: 0.1-25; n = 2423
All Research Staff: 3146 (95.2%); 16,347; 2.26 (4.96); Range: 0.1-251
Postdoctoral Fellow: 2167 (65.6%); 4772; 2.2 (1.89); Range: 0.1-26; n = 2707
Fellows not pursuing Master’s or PhD: 269 (8.1%); 518; 1.93 (1.66); Range: 1-12; n = 292
Post Health Professional degree: 550 (16.6%); 1084; 1.97 (3.16); Range: 0.05-65; n = 602
Doctoral students: 2566 (77.7%); 6744; 2.63 (2.18); Range: 0.25-25; n = 3296
Master’s students: 2232 (67.6%); 5289; 2.37 (2.02); Range: 0.25-32; n = 2668
Undergraduate students: 2112 (63.9%); 9832; 4.66 (5.28); Range: 0.1-62; n = 2631
All Trainees: 3188 (96.5%); 28,237; 2.85 (3.22); Range: 0.1-65

Source: End of Grant reports from CIHR’s Research Reporting System, April 4, 2017, provided by the Results and Impact Unit.

Note: The different research staff and trainee categories are not mutually exclusive; the same NPI can, and often does, report in multiple categories.

Figure E: Percentage of Grants Involving Research Staff and Trainees by Pillar

Source: End of Grant reports from CIHR’s Research Reporting System, April 4, 2017, provided by the Results and Impact Unit.

Long Description
  Pillar 1 Pillar 2 Pillar 3 Pillar 4
Research Staff 51 23 10 15
Trainees 79 10 4 7

Figure F: Average Number of Research Staff and Trainees per Grant by Pillar

Source: End of Grant reports from CIHR’s Research Reporting System, April 4, 2017, provided by the Results and Impact Unit.

Long Description
  Research Staff Trainees
Pillar 1 2.87 1.16
Pillar 2 2.89 4.71
Pillar 3 2.35 3.85
Pillar 4 2.92 4.25

Figure G: Percentage of Grants Involving Research Staff and Trainees by Sex of NPI

Source: End of Grant reports from CIHR’s Research Reporting System, April 4, 2017, and administrative data from CIHR’s Electronic Information System, June 23, 2017, provided by the Results and Impact Unit.

Long Description
  Male Female
Research Staff 66 34
Trainees 71 29

Figure H: Average Number of Research Staff and Trainees per Grant by Sex of NPI

Source: End of Grant reports from CIHR’s Research Reporting System, April 4, 2017, and administrative data from CIHR’s Electronic Information System, June 23, 2017, provided by the Results and Impact Unit.

Long Description
  Male Female
Research Staff 2.55 3.14
Trainees 2.81 2.98

Figure I: OSP Success Rates by Sex

Source: Overview of the Reforms to CIHR’s Open Suite of Programs: Peer Review Expert Panel - November 2016. Updated by Evaluation Unit and Results and Impact Unit in September 2018.

Long Description
  OOGP T-OOGP FDN PJT
  2006-07 2007-08 2008-09 2009-10 2010-11 2011-12 2012-13 2013-14 2014-15 2014-15 2015-16 2016-17 2015-16 2016-17 2017-18
Females (N = 16,404) 20% 21% 19% 17% 16% 15% 16% 15% 12% 8% 10% 13% 12% 17% 16%
Males (N = 33,999) 22% 23% 22% 18% 18% 19% 18% 15% 15% 13% 15% 13% 13% 16% 15%

Figure J: Percentage of OOGP Grants by Sex and Pillar

Source: Administrative data from CIHR’s Electronic Information System, June 23, 2017, provided by the Results and Impact Unit.

Long Description
  Male Female
Pillar 1 78 22
Pillar 2 61 39
Pillar 3 49 51
Pillar 4 46 54

Table 5: OSP Administrative Costs as Percent of Total Program Expenditure, 2010-11 to 2017-18
  2010-11 2011-12 2012-13 2013-14 2014-15 2015-16 2016-17 2017-18
Grants Expenditure ($) 419,263,388 433,577,824 449,679,548 456,160,828 493,198,143 496,146,559 522,474,148 539,349,771
Administrative Costs ($) 25,795,767 26,342,732 28,036,197 29,227,024 29,596,818 28,994,962 29,360,476 34,209,082
Total program expenditures 445,059,155 459,920,556 477,715,745 485,387,852 522,794,961 525,141,521 551,834,624 573,558,853
Administrative Costs as % of Total Program Expenditures 5.8% 5.7% 5.9% 6.0% 5.7% 5.5% 5.3% 6.0%

Source: CIHR Finance, April 2017.

Note: The cost of internal services was calculated as 3.5% of the Grants and Awards budget, in line with CIHR’s internal services allocation formula. Employee benefit plan (EBP) costs and accommodation costs were calculated at the Treasury Board rates of 20% and 13% respectively for both direct salary and internal services.
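
To illustrate how the costing formula in the note above fits together, the following sketch (in Python) assembles total administrative costs from their components. The split between direct salary and other direct costs is a hypothetical input for illustration; only the loading rates (3.5%, 20%, and 13%) come from the note.

    def total_administrative_costs(direct_salary: float,
                                   other_direct_costs: float,
                                   grants_and_awards_budget: float) -> float:
        """Illustrative reconstruction of the OSP administrative costing formula.

        The direct salary / other direct cost split is a hypothetical input,
        not a figure reported in the evaluation.
        """
        internal_services = 0.035 * grants_and_awards_budget        # 3.5% allocation formula
        ebp = 0.20 * (direct_salary + internal_services)            # Treasury Board EBP rate
        accommodation = 0.13 * (direct_salary + internal_services)  # Treasury Board accommodation rate
        return direct_salary + other_direct_costs + internal_services + ebp + accommodation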

Table 6: OSP Costs per Application and per Grant Awarded, 2010-11 to 2017-18
  2010-11 2011-12 2012-13 2013-14 2014-15 2015-16 2016-17 2017-18
Administrative costs ($) 25,795,767 26,342,732 28,036,197 29,227,024 29,596,818 28,994,962 29,360,476 34,209,082
Number of applications 4,636 4,578 4,586 5,389 4,048 4,723 3,484 6,351
Cost per application ($) 5,564 5,754 6,113 5,423 7,311 6,139 8,427 5,386
Number of grants 802 801 801 797 533 612 551 916
Cost per grant awarded ($) 32,164 32,887 35,001 36,671 55,529 47,377 53,286 37,346

Source: CIHR Finance, April 2017 and Funding Policy and Analytics, July 2018

Figure K: OSP Success Rates by Career Stage

Source: Overview of the Reforms to CIHR’s Open Suite of Programs: Peer Review Expert Panel - November 2016. Updated by Evaluation Unit and Results and Impact Unit in September 2018.

Long Description
  OOGP T-OOGP FDN PJT
  2006-07 2007-08 2008-09 2009-10 2010-11 2011-12 2012-13 2013-14 2014-15 2014-15 2015-16 2016-17 2015-16 2016-17 2017-18
Early Career Investigators (N = 10,473) 21% 19% 20% 15% 16% 14% 16% 12% 11% 4% 12% 11% 11% 16% 15%
Established Investigators  (N = 40,001) 22% 23% 21% 18% 18% 18% 18% 15% 15% 16% 13% 13% 13% 16% 15%

Figure L: Percentage of OOGP Grants by Career Stage and Pillar

Source: Administrative data from CIHR’s Electronic Information System, June 23, 2017, provided by the Results and Impact Unit.

Long Description
  New Investigator Mid Career Senior Investigator
Pillar 1 16 40 44
Pillar 2 25 43 32
Pillar 3 34 44 22
Pillar 4 25 44 31

Figure M: OSP Success Rates by Preferred Language

Source: Overview of the Reforms to CIHR’s Open Suite of Programs: Peer Review Expert Panel - November 2016. Updated by Evaluation Unit and Results and Impact Unit in September 2018.

Long Description
  OOGP T-OOGP FDN PJT
  2006-07 2007-08 2008-09 2009-10 2010-11 2011-12 2012-13 2013-14 2014-15 2014-15 2015-16 2016-17 2015-16 2016-17 2017-18
English (N = 49,078) 22% 23% 21% 18% 17% 18% 18% 15% 14% 11% 13% 13% 13% 17% 15%
French (N = 1,396) 17% 16% 12% 18% 19% 12% 7% 12% 10% 6% 7% 18% 7% 8% 8%

Appendix B - Methodology

Overview of Methodology

Consistent with TBS guidelines and recognized best practices in evaluation, a range of methods and data sources were used to triangulate the evaluation findings. These methods included document and data review, end of grant reports, case studies, and bibliometric analyses. When possible and appropriate, comparisons were made to findings from the previous evaluation of the OOGP (2012). Interviews were not used as a methodological tool for examining the design and delivery of the new suite of programs, given the number of consultations that had been ongoing since the reforms. For example, there were significant consultations with the research community during the reforms, large amounts of application/implementation data were collected from applicants through surveys, CIHR hosted the international Peer Review Expert Panel, and a Foundation Grant Program Review Committee was struck in 2017. Therefore, under guidance from senior management at CIHR, additional interviews were not conducted; instead, results from these data sources and committees were used as inputs into the evaluation.

Document and Data Review

Relevant program, CIHR, and Government of Canada documents were consulted to provide context as well as to help address some evaluation questions related to relevance, performance, and design and delivery. These included (among others) the Fundamental Science Review (2017), the Federal Budgets 2018-2019, Canada’s Science Vision, the CIHR Act and its current and previous Strategic Plans (Roadmap and Roadmap II), various design documents for the reforms, pilot evaluation results for the Foundation Grant Program, and the OOGP Evaluation (2012). Administrative data for the program, from CIHR’s Electronic Information System, was reviewed in relation to application, financial, and competition information. End of grant report data from CIHR’s Research Reporting System was also analyzed for the OOGP (more details below).

End of Grant Reports

Data on outputs and outcomes stemming from OOGP grants were collected via end of grant reports available through the RRS for OOGP funding competitions run between 2000 and 2013, with submission dates for these reports ranging between 2011-12 and 2016-17. The sample in the RRS database consisted of 3304 end of grant reports. A total of 13,331 grants were awarded during the time data were gathered for the OSP evaluation, representing the population of grants; thus, the sample consisted of end of grant reports for 29% of grants awarded during the evaluation period. This low rate is not surprising, given that the requirement for end of grant report completion was fully implemented starting in 2011, resulting in many reports either not being completed (i.e., for grants reaching completion prior to this date) or not being completed within the specified 18-month period following the grant. The grant’s NPI was required to complete an end of grant report within 18 months following the end of the grant funding period. Cases are defined by individual grants rather than NPIs, and NPIs may have had more than one end of grant report.

The sample of end of grant reports was representative of the population in terms of NPI gender, pillar, and language. Of the 3304 cases included in the sample, the majority (72%) had male NPIs and a minority (28%) had female NPIs; a few (4; 0.12%) did not specify a gender. Recall that the n refers to grants rather than individuals, meaning that there may be overlap among some of these NPIs. A breakdown by pillar revealed that the majority (73%) of grants were associated with Pillar 1, with smaller proportions for Pillar 2 (11%), Pillar 3 (7%), and Pillar 4 (9%); a few grants (9; 0.3%) were not associated with a pillar (identified as not applicable/specified). The majority of NPIs on these grants preferred English as their primary language (87.5%), and the largest group were mid-career researchers (42%), followed closely by senior career researchers (38%).

Items from the RRS were matched to relevant evaluation questions and indicators, and quantitative analyses were conducted on the associated data. This data source was the primary line of evidence informing the findings related to the OOGP’s performance. Furthermore, the Canadian Academy of Health Sciences Impact Framework (CAHS, 2009) was used to guide the analysis of outcomes and impacts from the OOGP. It should also be noted that data from end of grant reports were disaggregated by pillar, sex, career stage, and language, with some comparative analyses undertaken when sample sizes were large enough. Where sample sizes for disaggregated data were low (n < 10), results are not reported.

Qualitative analyses were also conducted on a selection of six outcomes/outputs in the end of grant reports relating to the CAHS impact areas of Informing Decision-Making and Health Impacts. The six outcomes were: Information or Guidance for Patients or Public; Patients’ or Public Behaviour; Patents; Policies, Guidelines, and Programs; Professional Practice; and Vaccines/Drugs. A random sample of grants was selected from among those that specified an “advanced” outcome and for which an open-ended response was also provided. Approximately 40% of the total open-ended responses across the six indicators were included in the sample, with 26-55% of responses sampled for each indicator, to ensure that 1) included responses were balanced across pillars, and 2) a sufficient number of responses were analyzed for each indicator (i.e., at least 25 responses per pillar, per indicator, wherever possible). The open-ended responses were analyzed qualitatively for emerging themes related to each type of output. A sketch of this sampling logic appears below.
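
The sketch below is a minimal illustration of the sampling step (in Python, using pandas); the data frame columns and the per-pillar floor of 25 are hypothetical stand-ins for the RRS fields and the targets described above:

    import pandas as pd

    def sample_open_ended(df: pd.DataFrame, indicator: str,
                          per_pillar: int = 25, seed: int = 1) -> pd.DataFrame:
        """Draw a random, pillar-balanced sample of open-ended responses for one
        indicator, restricted to grants reporting an 'advanced' outcome.
        Column names are hypothetical stand-ins for RRS fields."""
        eligible = df[(df["indicator"] == indicator)
                      & (df["outcome_level"] == "advanced")
                      & df["open_ended_response"].notna()]
        # Sample up to `per_pillar` responses within each pillar.
        return (eligible.groupby("pillar", group_keys=False)
                .apply(lambda g: g.sample(n=min(len(g), per_pillar), random_state=seed)))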

Case Studies

Eight high-impact cases were purposively sampled for a case study analysis to provide insight into how CIHR investments have led to the achievement of highly impactful outcomes. Cases were selected using end of grant data: the “most impactful” cases were identified based on a combination of self-reported indicators across the five CAHS dimensions of impact (i.e., those grants identified as having the greatest number of impactful outcomes, ideally across multiple dimensions). Indicators included the number of journal articles resulting from the grant (top 1%; Advancing Knowledge), the number of trainees/research staff associated with the grant (top 1%; Building Capacity), and reported “Advanced” outcomes resulting from the grants in all five CAHS categories (Advancing Knowledge, Building Capacity, Informing Decision-Making, Health Impacts, and Socioeconomic Impacts).

Bibliometric Analysis

A bibliometric study was conducted for this evaluation by the Observatoire des sciences et des technologies (OST) of the Université du Québec à Montréal. The study provided data on the scientific productivity and impact of funded and unfunded OOGP applicants compared with other health researchers in Canada and OECD countries. To measure research productivity, the study examined articles published between 1998 and 2016 by a sample of funded and unfunded applicants who applied to the OOGP between 2000 and 2014 (funded n = 2000; unfunded n = 500). Two indicators were used to measure the scientific impact of applicants’ articles published between 2000 and 2016: the average of relative citations (ARC) and the average relative impact factor (ARIF).
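
For reference, these OST indicators are conventionally defined as field- and publication-year-normalized averages. A sketch of the standard formulations (the notation below is ours, not taken from the OST study):

$$\mathrm{ARC} = \frac{1}{N}\sum_{i=1}^{N} \frac{c_i}{\bar{c}_{f(i),\,y(i)}}, \qquad \mathrm{ARIF} = \frac{1}{N}\sum_{i=1}^{N} \frac{\mathrm{IF}_{j(i)}}{\overline{\mathrm{IF}}_{f(i),\,y(i)}},$$

where $c_i$ is the citation count of article $i$, $\bar{c}_{f(i),y(i)}$ is the mean citation count of all articles published in the same field and year, $\mathrm{IF}_{j(i)}$ is the impact factor of the journal in which article $i$ appeared, and $\overline{\mathrm{IF}}_{f(i),y(i)}$ is the average impact factor for the field and year. Values above 1 indicate impact above the world average for comparable articles.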

Limitations

The following limitations, mitigation strategies, and impacts of those strategies should be noted:

Limitation: Inability to assess performance of FGP and PGP; use of OOGP to infer performance of new programs
  • Due to the stage of implementation and modifications to the FGP and PGP, a performance evaluation of some aspects of the OSP was not possible.
  • Mitigation: Information relevant to the FGP and PGP is presented wherever available, and findings are clearly linked to either the OOGP or the FGP and PGP where appropriate.
  • Impact: Performance results for the FGP and PGP are limited and are mainly presented in the context of OOGP performance results, where applicable.

Limitation: Contribution vs. attribution
  • Attributing outcomes and impacts of grants solely to OOGP funding was not possible given that researchers have additional sources of funding and support, as well as additional possible confounding variables (e.g., field of research).
  • Mitigation: Conclusions from this evaluation speak to CIHR’s contribution to trainee and researcher outcomes and impacts.
  • Impact: As the decision was taken to focus on the contribution of funding to recipients’ outcomes, attribution of funding to these outcomes is not discussed.

Limitation: Reliability and generalizability of self-report data
  • Performance results are based largely on existing and available self-report data, subject to potential biases and recall issues, potentially limiting generalizability.
  • There are concerns with the reliability of some end of grant data due to variation in the level of completeness as well as the structure of the questions and the length of the report.
  • Given the timeframe within which the end of grant report is administered (~18 months post grant expiry), it is possible that longer-term impacts are not fully captured; however, few researchers felt these would occur in the future, and the case studies (with a much longer follow-up period) also did not indicate that longer-term outcomes were realized.
  • Impact: Findings from the RRS data are presented, but given potential data issues, generalizability beyond this sample may be limited.

Limitation: Use of secondary data sources
  • Some of the data included was secondary data collected for different purposes, generated at different points in time, by different sources. These included the considerable data collected and analyses done on the FGP and PGP (e.g., pilot and quality assurance studies), the recommendations of the Peer Review Working Group and the Peer Review Expert Panel (2017), and the end of grant reports (2011-2016).
  • Mitigation: Wherever possible, we ensured that the information was reliable, sources were clearly cited, and multiple data sources were used as inputs, where available.
  • Impact: Efforts were taken to clearly identify variations in data sources, and data from different sources were used to triangulate findings where appropriate.

Appendix C – References & End Notes

References
