Design Discussion Document - Proposed Changes to CIHR's Open Suite of Programs and Enhancements to the Peer Review Process - Long Descriptions

Figure 1: CIHR's top-down and bottom-up strategies and their objectives

The figure above shows CIHR's Top-Down and Bottom-Up funding strategies. It shows how both strategies are designed to achieve the full spectrum of CIHR's mandate, and how CIHR's peer review system plays an important role in achieving these strategies. The figure also contains text describing the objectives of the Top-Down and Bottom-Up strategies, and is described below:

CIHR's Top-down Strategy is targeted to specific areas of health research and knowledge translation. These programs and initiatives are intended to:

  • Focus on gaps in specific research areas and research communities or
  • Leverage existing strengths for impact

CIHR's Bottom-up Strategy is open to all areas of health research and knowledge translation. This suite of programs is intended to:

  • Capture excellence across all pillars
  • Capture innovative/breakthrough research
  • Contribute to improved sustainability of long-term research enterprise
  • Integrate new talent

« Back to figure 1


Figure 2: Amount of in-year Funding from the Open Suite of Programs held by Nominated Principal Investigators in 2010-2011

The figure above shows the amount of in-year funding (grants) awarded through the Open Suite of Programs held by unique Nominated Principal Investigators in 2010-2011.

  • Of the more than 3,000 Nominated Principal Investigators with funding in 2010-2011, approximately 2,000 held either one grant or a combination of two or three grants worth a total of less than $150,000.
  • The average amount of in-year funding held was $162,000 per year.
  • Approximately 1,000 Nominated Principal Investigators held one grant or a combination of two or three grants worth a total of between $150,000 and $350,000.
  • Approximately 200 Nominated Principal Investigators held one grant or a combination of two or three grants worth a total of between $350,000 and $750,000.
  • Approximately 50 Nominated Principal Investigators held one grant or a combination of two or three grants worth a total of more than $750,000.

Note that there is only one Nominated Principal Investigator per grant awarded, and that in-year funding does not include fellowships. CIHR defines a Nominated Principal Investigator as a funded Nominated Principal Applicant. The definition of a Nominated Principal Applicant can be found in CIHR's Grants and Awards Guide.

« Back to figure 2


Figure 3: List of common challenges as identified by various CIHR stakeholders

The figure above lists common challenges in the Open Suite of Programs that have been experienced and/or identified by various CIHR stakeholders. These challenges include:

  • Funding Program Accessibility and Complexity
  • Applicant Burden/"Churn"
  • Application process/attributes do not capture the correct information
  • Insufficient support for new/early career investigators
  • Researcher and knowledge user collaborations not fully valued
  • Lack of expertise availability
  • Unreliability/Inconsistency of reviews
  • Conservative nature of peer review
  • High Peer Reviewer Workload

The majority of these challenges were brought to CIHR's attention through CIHR's Roadmap Consultations, the Institutes and their communities, CIHR's University Delegates, surveys, the International Review Panel, Chairs and Scientific Officers, and Partners.

« Back to figure 3


Figure 4: Stakeholder Satisfaction - Peer Review (Percent of respondents who provided an opinion)

Survey conducted by Ipsos Reid (2010) showing CIHR stakeholder satisfaction in Peer Review. ("Satisfied" category includes very satisfied, somewhat satisfied and neutral; "Dissatisfied" category includes somewhat dissatisfied and very dissatisfied). Based on the percent of respondents who provided an opinion:

  • 79% of peer reviewers were satisfied with the efficiency of the peer review process, while 17% of peer reviewers were dissatisfied with the efficiency of the peer review process
  • 70% of peer reviewers were satisfied with the fairness of the peer review process, while 26% of peer reviewers were dissatisfied with the fairness of the peer review process
  • 54% of applicants and grantees were satisfied with the quality of peer review judgements, while 44% of applicants and grantees were dissatisfied with the quality of peer review judgements
  • 40% of applicants and grantees were satisfied with the consistency of peer review judgements, while 58% of applicants and grantees were dissatisfied with the consistency of peer review judgements
  • 40% of institutional stakeholders were satisfied with the consistency of peer review judgements, while 48% of institutional stakeholders were dissatisfied with the consistency of peer review judgements

« Back to figure 4


Figure 5: How the proposed design elements address multiple challenges in the current competition and peer review systems

The figure above shows how each of the proposed design elements for the New Open Suite of Programs and Peer Review Enhancements addresses multiple challenges that CIHR's stakeholders have identified with the current competition and peer review processes. For instance:

The Foundation/Programmatic Research Scheme addresses:

  • Funding Program Accessibility and Complexity
  • Applicant Burden/"Churn"
  • Insufficient support for new/early career investigators
  • Researcher and Knowledge User collaborations not fully valued

The Project Scheme addresses:

  • Funding Program Accessibility and Complexity
  • Application process/attributes do not capture the correct information
  • Insufficient support for new/early career investigators
  • Researcher and Knowledge User collaborations not fully valued

The College of Reviewers addresses:

  • Lack of expertise availability
  • Inconsistency of reviews

The multi-phase competition process addresses:

  • Applicant Burden/"Churn"
  • Application process/attributes do not capture the correct information
  • High Peer Reviewer Workload

Application-Focused Review addresses:

  • Unreliability/Inconsistency of reviews
  • High Peer Reviewer Workload

Structured Review Criteria addresses:

  • Application process/attributes do not capture the correct information
  • Inconsistency of Reviews
  • Conservative Nature of Peer Review
  • High Peer Reviewer Workload

The remote (virtual) screening process addresses:

  • Lack of expertise availability
  • Conservative Nature of Peer Review
  • Inconsistency of reviews

Mainstreaming integrated knowledge translation addresses:

  • Researcher and Knowledge User collaborations not fully valued

« Back to figure 5


Figure 6: Examples of programs of research that may be funded through CIHR's new Foundation/Programmatic Research Scheme

The figure above lists examples of potential programs of research that may be funded through CIHR's new Foundation/Programmatic Research Scheme. These examples are:

  • A seasoned investigator who is studying pulmonary surfactant, including biophysical approaches, tissue culture models, animal models, and potential therapeutic interventions
  • Three long-time collaborators who have a program of research studying usage patterns/efficacy, cultural dimensions and cost-effectiveness of alternative therapies for diabetes in First Nations communities
  • An integrated collaboration of researchers and healthcare providers who are developing, implementing and evaluating the effectiveness of mental health counseling delivery models for rural communities
  • A multi-disciplinary collaboration studying the effectiveness of clinical and behavioural interventions (including ethical dimensions) aimed at reducing rates of obesity for diverse population groups
  • A new investigator who is developing an e-health observatory to monitor the effects and effectiveness of health information system deployment in Canada
  • A group of investigators studying the determinants and control of cancer using genetic, proteomic and tissue culture approaches.

« Back to figure 6


Figure 7: Competition process for the Foundation/Programmatic Research Scheme

The figure above provides a high-level overview of the competition process for the Foundation/Programmatic Research Scheme. The diagram shows the three competition process stages, with the following process steps:

Stage 1 – Screening Caliber of Applicant

  1. Submit Stage 1 Application
  2. Match application to reviewers
  3. Complete Stage 1 Review
  4. Results

Stage 2 – Screening Quality of Proposed Program and Support Environment

  1. Submit Stage 2 Application
  2. Match application to reviewers
  3. Complete Stage 2 Review
  4. Results

Stage 3 – Assessment

  1. Complete Final Assessment
  2. Decision

« Back to figure 7


Figure 8: Examples of research projects that may be funded through CIHR's new Project Scheme

The figure above lists examples of potential research projects that may be funded through CIHR's new Project Scheme. These examples are:

  • The role of neuromuscular electrical stimulation in preserving muscle mass and strength in intensive care unit patients
  • Developing an evidence-based intervention that addresses the physical and psychological dimensions of disability for front-line healthcare workers
  • Care and Construction: Assessing Differences in Nursing Home Models of Care on Resident Quality of Life
  • Pharmacokinetic optimization of a new drug for the treatment of Type-2 Diabetes
  • Global Health Diplomacy: A Case Study of Canadian Efforts to Integrate Health in Foreign Policy
  • The impact of a brief motivational intervention on adherence behavior in asthmatics: A randomized controlled trial

« Back to figure 8


Figure 9: Competition process for the Project Scheme

The figure above provides a high-level overview of the competition process for the Project Scheme. The diagram shows the three competition process stages, with the following process steps:

Stage 1 – Screening Quality of Idea

  1. Submit Stage 1 Application
  2. Match application to reviewers
  3. Complete Stage 1 Review
  4. Results

Stage 2 – Screening Feasibility

  1. Submit Stage 2 Application
  2. Match application to reviewers
  3. Complete Stage 2 Review
  4. Results

Stage 3 – Assessment

  1. Complete Final Assessment
  2. Decision

« Back to figure 9