Institute of Aging: Stakeholder Engagement Report to the CIHR Peer Review Expert Panel

November 2016

Main Messages

  1. All agreed that too much change happened at once.
  2. Completing the CV module for reviewers is a long and complicated process.
  3. The new system makes it harder for those who work at the intersection of the social and health sciences; aging is a highly interdisciplinary area of research.

Stakeholder Engagement Approach

IA convenes biannual meetings with the directors of all research centres on aging in Canada. These meetings usually occur in March and October each year. The March meeting is generally held by videoconference, while the October meeting takes place in person at the annual Canadian Association on Gerontology (CAG) conference, where most of the directors are in attendance. IA's stakeholder engagement approach involved embedding the consultation into our scheduled October 2016 meeting at CAG. The stakeholder consultation questions were circulated to the 34 directors before the meeting so that they could reflect on them. The directors raised many thoughtful points, which were recorded by IA staff during the consultation. The list of points was then edited and circulated by email to all directors so that corrections and additions could be made, and so that absent directors could add their voices to the consultation. These additional comments were collected by IA staff and incorporated into this document.

Participants

IA obtained input from approximately 15 of the directors of research centres on aging in Canada. The directors are all senior researchers, and all have received CIHR funding at some point. These stakeholders are in direct daily contact with their respective aging research communities and represent hundreds of researchers across Canada. They are not only in close contact with researchers but also closely connected to the research activities themselves. The stakeholder group was grateful for the opportunity to be consulted.

Summary of Stakeholder Input

Question 1: Does the design of CIHR's reforms of investigator-initiated programs and peer review processes address their original objectives?

  • All agreed that the reforms were likely criticized so heavily because the success rate was very low. IA clarified that part of the problem was that cancelling a competition caused a surge of applications, so many more applications than usual were received.
  • All agreed that too much change happened at once. Measures should have been put in place to achieve the same rigour as face-to-face review. The Foundation Scheme posed major problems in its first round; one problem was that very senior and very junior researchers were lumped together.
  • The new system makes it harder for those who work at the intersection of the social and health sciences. Peer review processes should be fostering more cross-disciplinarity. The impact falls particularly on the social disciplines: social scientists doing collaborative work are feeling shut out. Aging is a highly interdisciplinary area of research.
  • In terms of online review, there are certain improvements, e.g., less end-of-day fatigue and less travel.
  • Very positive comments were received from one person with respect to the online system for peer review. Because this person did not like the old face-to-face system, they felt that the new system offers more freedom for the reviewer. The old approach was seen as damaging to a reviewer's ability to perform research, since it required travel and being out of the office for more than a day.
  • There was concern that this process does not allow for cross-reviewer dialogue, which can be critical to the review. One person suggested that a dialogue component be established, which could be done efficiently via teleconference or videoconference rather than in person.
  • The idea of the Foundation Scheme is much appreciated, as it offers more freedom and flexibility. One problem: the method for estimating the eligible amount does not make sense, and there should be a calculator on the CIHR website for this purpose. One researcher reached out to CIHR to understand how the budget is calculated, and it took six months to get a response. There needs to be clarification and transparency regarding the estimation of the eligible amount.
Question 2: Do the changes in program architecture and peer review allow CIHR to address the challenges posed by the breadth of its mandate, the evolving nature of science, and the growth of interdisciplinary research?

  • The CV module for reviewers is very long and complicated to complete. The onus is placed on reviewers to do a great deal of work before they even begin to review.
  • People are asked to review in multiple rounds. How sustainable is this?
  • This amounts to an erosion of the peer review process: if you keep asking the same people, they will start to decline, and you will then have to turn to your 2nd, 3rd, 4th, or 5th choices.
  • One gap cited as an example is that no one knows how to evaluate research coming out of a college. It is also difficult for colleges to apply, since some components of the application are not relevant to them.
  • Reviewers are often not well matched to the sector or topic they are asked to review.
  • Scientific journals are also experiencing these recruitment problems ('the hunt for reviewers').
Question 3: What challenges in adjudication of applications for funding have been identified by public funding agencies internationally and in the literature on peer review and how do CIHR's reforms address these?

  • Some felt that CIHR may eventually have to pay reviewers. Others felt that reviewing is part of academic responsibility and that paying for reviews is a slippery slope.
  • University recognition is not enough; peer reviewers should be better recognized and rewarded for their efforts.
  • Challenge of recruiting reviewers from Canada's smaller research-intensive universities
Question 4: Are the mechanisms set up by CIHR, including but not limited to the College of Reviewers, appropriate and sufficient to ensure peer review quality and impacts?

  • It was felt that CIHR reacted too quickly and threw the baby out with the bathwater. There was consensus on this among the group.
Question 5: What are international best practices in peer review that should be considered by CIHR to enhance quality and efficiency of its systems?

  • Belgium pre-populates the content of the CV module for reviewers, and its web-based system is very well organized. Reviewers need only perform the review rather than inputting large amounts of data. Payment is €200. Belgium also provides the comments of the other evaluators, so each reviewer feels part of the process, much like a journal review.
  • Ireland provides a link and a PDF to documents; it is very simple.
  • The Common CV (CCV) structure is not easy to read (or to provide); many would much prefer a regular CV. The CCV is not aligned with actual CVs. Some researchers would prefer to present their whole career, not just the past 5-7 years; others would not support this, since an enormous amount of irrelevant content could be provided for senior researchers. CV content relevant to the application but outside the 5-7 year time frame can be incorporated into the most significant contributions section. Better guidelines on how to develop and set up the most significant contributions page would help; the approach to this important page of the CV seems very inconsistent.
  • Some stated that they do like the CCV, although all agreed that there are too many versions of it.
  • The Saskatchewan Health Research Foundation used a hybrid web-based/in-person model for its review of applications, and it worked well: one chair and one other person attend in person, and the rest participate online. This could be used for the CIHR in-person model rather than spending the time and money needed for reviewers to travel to Ottawa.
Question 6: What are the leading indicators and methods through which CIHR could evaluate the quality and efficiency of its peer review systems going forward?

  • Open this survey to the research community of Canada
  • Scan international best practices
  • Reliability – to what extent do the two reviewers agree?
  • Reviewers should have good knowledge of the area that they are reviewing
  • Look at outcomes of grants in the long term and see how they relate to the PRP rating
  • Currently the PI does not have administrative rights over their grant application; it can be edited without their knowledge by other co-investigators. This should change.
  • Should compare with other similarly sized countries
  • It seems that basic scientists are most opposed to the reforms. They perceive that their funds have been taken away and given to other areas they consider less important or useful (KT, etc.).
  • Presence of a reviewer with interdisciplinary experience