Investigator Grants provide the highest-performing researchers at all career stages with consolidated funding for their salary (if required) and a significant research support package.

In this way, the Chief Investigator has the flexibility to pursue important new research directions as they arise, adjust their resources accordingly, and form collaborations as needed, rather than being restricted to the scope of a specific research project.

Investigator Grants support:

  • research across the four pillars of health and medical research:
    • biomedical
    • clinical
    • public health
    • health services research
  • all career stages:
    • early career researchers
    • mid-career researchers
    • established researchers
  • researchers with clinical responsibilities (such as clinicians, public health and allied health practitioners)
  • full-time and part-time researchers.

Investigator Grants aim to:

  • allow flexibility for investigators to pursue important new research directions as they arise and to form collaborations as needed
  • foster innovative and creative research
  • create opportunities for researchers at all career stages to establish their own research programs
  • reduce application and peer review burden on researchers.

Working towards gender equity in Investigator Grants

Following NHMRC's national consultation during 2022 on options to reach gender equity in the Investigator Grant scheme, NHMRC has implemented additional special measures under the Sex Discrimination Act 1984 to address systemic disadvantage faced by female and non-binary applicants to its Investigator Grant scheme. For the 2024 round, NHMRC will again:

  • continue to use structural priority funding for women at the Emerging Leadership levels of the scheme (EL1 and EL2) to the extent necessary to achieve gender equity targets
  • for the Leadership category (L1, L2 and L3 combined), adopt a new intervention with the target of equal numbers of grants by gender
  • include non-binary researchers alongside women in both gender equity interventions (structural priority funding for Emerging Leadership grants and the new intervention for Leadership grants)
  • adopt a single Research Support Package of $400,000 for all Leadership grants.

Further information on the findings of the consultation is available in the NHMRC consultation report: Options to reach gender equity in the Investigator Grant scheme.

Funding support

Investigator Grants may be funded by or in conjunction with other organisations. These grants offer opportunities to researchers whose work is particularly relevant to the priorities and research interests of the partner organisations.

Information on how organisations and individuals can support NHMRC to fund health and medical research is available on Working together to support health and medical research.


Grant at a glance

Duration of funding
5 years

Key changes

Key changes in 2024

2024 Investigator Grants scheme dates

As planned and communicated since late 2022, NHMRC opened the 2024 Investigator Grant round in September 2023. This is in response to sector feedback about the schedules of our largest grant schemes and aims to enable Investigator Grant outcomes to be advised prior to 2024 Ideas Grants closing.

To enable this, the 2024 Investigator Grant scheme opened for applications on Wednesday 13 September 2023.

  • Minimum data was due on 11 October 2023 17:00 (AEDT).
  • PhD census date is the close of applications.
  • Applications closed on 9 November 2023 17:00 (AEDT).
  • Conflict of interest and suitability declarations will occur in late November/early December.
  • Assessments will commence in late January 2024.
  • Outcomes are expected to be advised in early May 2024 ahead of the Ideas Grant scheme closing.

Further details on this grant opportunity, including the grant guidelines, will be available on GrantConnect when the scheme opens.

Each year, we seek feedback from peer reviewers and receive feedback from applicants on peer review policies and processes. This feedback contributes to a detailed policy analysis to determine whether our grant program is achieving its objectives and if any policy adjustments are required.

Key changes will be detailed in section 2.1 of the Investigator Grants 2024 Guidelines available from GrantConnect in September 2023.

Peer review

For details regarding the Investigator Grants 2024 peer review process, refer to the Investigator Grants 2024 Guidelines on GrantConnect.

As part of NHMRC's ongoing support to peer reviewers, the Investigator Grants Peer Review Mentor (PRM) video features senior researchers discussing their approach to organising, assessing, scoring and comparing Investigator and other NHMRC grant applications equitably.

The 2023 CEO peer reviewer briefing webinar highlights the key changes to peer review in the 2023 round, outlines the legislative obligations of peer reviewers and provides helpful hints and tips for conducting fair and impartial peer review.

In our 2024 CEO Introduction webinar, NHMRC’s CEO, Professor Steve Wesselingh, and Research Foundations Executive Director, Dr Julie Glover, welcomed 2024 Investigator Grants peer reviewers. They shared tips and tricks for quality peer review and outlined what resources are available. You can read a copy of the webinar transcript, or watch the webinar on Vimeo.

Mentor video

The Investigator Grant Peer Review Mentor video follows:

Introduction by Doctor Julie Glover, NHMRC Executive Director, Research Foundations

Welcome to the Investigator Grant Peer Review Mentor Video. This video is part of the training and support available to peer reviewers that will help you prepare for the types of issues you'll need to consider during your independent assessments. Our peer review mentors are strong leaders in health and medical research from a range of different research fields who have experience with peer review and have agreed to share their thoughts and approaches.

We've asked our mentors a series of questions based around themes that come up commonly in the peer review of Investigator Grants, but also in NHMRC peer review more generally. You'll see from our mentor responses that there are different ways that reviewers approach their tasks, but you'll also see that there is a common thread of reviewers striving for quality, consistency, and fairness in their approaches. This video will address some of the key changes made to the peer review of Investigator Grants. These include a change in publications track record assessment, which will shift the focus from the quantity of the publications to the quality of the research and the contribution to science. The other key change that will be discussed by our mentors is the clarification on how to assess those applications that you feel have been submitted at the wrong Level.

It is essential for the fairness of peer review that application track records continue to be assessed relative to opportunity. Our relative to opportunity policy recognises that not all research careers are the same and that an applicant's contribution and research productivity should be assessed commensurate with the opportunities that have been made available to them. Please also remember to give full consideration to career disruptions, particularly where these can have an ongoing impact on the research career.

To help us identify important improvements for future rounds, we really appreciate your participation in the peer review survey, which will be available to you following the completion of your assessments. NHMRC is committed to improving our processes, and the feedback that we receive through this process will be used to inform potential improvements to the scheme. We hope that this video addresses some of the key questions you may have and, along with the category descriptors and other documentation you have been provided, helps you in completing your assessments. Thank you for taking the time to view this video and for your important contribution to independent NHMRC peer review.

Peer Review Mentor introductions

Prof. Patsy Yates: I'm a registered nurse and have been working in clinical education and research aspects of nursing and health services with a particular focus on cancer and palliative care research. 

Prof. Yvette Roe: I'm a Njikena Jawaru woman from the West Kimberley. So I'm a very mid-career First Nations researcher whose work is focused on health service redesign. 

Prof. Nicholas Talley: I'm a gastroenterologist. I'm a clinician researcher and I've been doing research into gastrointestinal disease particularly inflammatory diseases recently in the microbiome for quite a number of years. 

Prof. Jennifer Stow: I'm a molecular cell biologist. Been doing cell biology research for over 30 years. 

Prof. Sarah Russell: I'm an immunologist and cell biologist.

Prof. Christopher Fairley: For the last 20 years, I've been director of Melbourne Sexual Health Centre, carrying out research in the area of sexual health and public health. 

What makes a good application?

Prof. Patsy Yates: So to me, an excellent research program in terms of knowledge gain is research that makes an extremely significant contribution to knowledge in a field. In terms of addressing a particular problem, it's not necessarily research about the most prevalent problem in our community, but research that truly adds significantly to our understanding of a particular health issue and is ultimately creating advances in how we can improve the health of our community.

Prof. Yvette Roe: The characteristics of an excellent research program, specifically for knowledge gain, include clarity of the point that the researcher is writing about, that it's well supported, that the statements being made can be justified and verified, that I can coherently understand what the researcher is trying to do and how they're planning to do it, and also that there's a clear gap in evidence and the proposed research program will fit into that gap in knowledge. So clarity, acceptability, relevance, and also priority, especially in First Nations health.

Prof. Nicholas Talley: I must admit, I am influenced by how easy to read the application is, how clear the hypotheses and aims are, how logically the program follows a plan, how the importance and the relevance of the research is made very clear, and how robust the methods are going to be. It doesn't have to be very detailed for this, but it has to be convincing.

Prof. Sarah Russell: I think it has to be exciting. If you don't walk away wanting to think more about it, then that's a bit of a problem, and unique and with a very clear focus on a big picture of relevance to health.

Prof. Christopher Fairley: So I think there's one overriding characteristic that no grant can get up without, and that is it has to be easy to read and clear to someone who is not an expert in that field. And if you can't do that, if you can't create a grant that someone can read and understand without laboriously going back and forth, you just won't score highly enough to get funded in my particular view.

Using the Category Descriptors

Prof. Yvette Roe: The category descriptors are going to be your guide to assessing the document. Each research application deserves to be assessed individually. What distinguishes an outstanding research proposal from an excellent one is often the fine details: that there is a clear program design, there's a solid scientific contribution, the team are leaders in their field and are going to make a significant impact, and those things are well justified by the author.

Prof. Nicholas Talley: It's very much a judgement call, and that's why it's so important to look at those descriptors very, very carefully before doing any of the grants. And again, refreshing your mind each time you look at it. Looking at each grant independently, not relating the grants to each other.

Prof. Jennifer Stow: I try to review each application on its own merits to begin with, and then later on, once I've done them all on their own merits, go back and try to compare them with each other and make sure they're all ranked appropriately. I think any truly exceptional applications stand out immediately. 

Track Record

Prof. Patsy Yates: Exceptional to me in publications is really about the quality and the contribution that the publication has made to science. So here I look, for example, at how significant and original the publication was, how it added to new knowledge around a particular problem, and how it's been used by, taken up by or influenced other people, for example other researchers in the field. Metrics aren't necessarily always the key to assessing how significant a publication is. So I look very carefully at how an applicant has described the significance, and using the narrative and the explanation, you can often get a feel for how significant that publication has been.

Prof. Christopher Fairley: This is a relatively small amount of text to look at, but it's really important that assessing these publications is done thoughtfully and fairly. And you can't just look at them and say, "Gee, that's great. They’ve got two Lancets, two New England and a JAMA”. Seven out of seven or whatever the top score is. You’ve got to put some thought into it, and the key is what is this applicant's contribution to these papers?

Publications: shift to top 10 in 10

Prof. Patsy Yates: So how I see this is that I'm coming to do a peer review where I'm bringing my understanding of my disciplinary context, my understanding of publication practices, and my understanding of the significance of a particular research finding or publication to the field. That's what I'm taking into account when I'm looking at the person's description and their explanation of what their contribution has been to science, and at how the individual has explained what their particular part in that publication was as well.

Prof. Sarah Russell: So remember, you are looking for the quality of research and the contribution to science. Well, to my mind, this is a very welcome shift in emphasis from quantity to quality, and it'll hopefully make it much easier to address the publication record. Exactly how you determine the quality and contribution of each paper is no different than it has been for a very long time, the only change is that there's no longer an opportunity or responsibility to factor publication numbers into the score. 

Assessing Research Impact 

Prof. Patsy Yates: And I think it really does reflect an important change in the way we think about the value of research to our communities. And what's great about the development of this in NHMRC peer review processes is that we've thought about impact in terms of not just knowledge gain, but also in terms of health impacts, economic impacts and social impacts.

Prof. Christopher Fairley: There are three areas for research impact: reach and significance, the research program's contribution, and the applicant's contribution. So across these three sections, I'm trying to get a feel for: okay, what is it this person's claiming? What is their individual contribution to it? What is the contribution of the particular research program? If they're the head of the entire program, it's relatively easy, but they might just be one part of an entire research program, and I'm trying to get a feel for those three things as I read through it. It's very important that it's clearly written, it makes sense, it's easy to read and the claims are reasonable and entirely justifiable.

Prof. Patsy Yates: So this is actually where, as reviewers, your expert knowledge of the context in which that particular impact is being undertaken becomes really important.

Prof. Yvette Roe: Try not to force things that aren't naturally there just because you'd like them to be. Assess the document with what's in front of you.

Prof. Nicholas Talley: Research impact, again, can be quite difficult because of the varying groupings. Some are easier than others and I think that's one of the things, one of the traps for the assessors here. So it's really important to look at which areas have been picked for research impact and how they've been justified.

Prof. Jennifer Stow: The evidence, I think, is important always and that has to come from the quality of the outputs themselves, any measures or metrics that people can provide. And also, I think, in addition to those things, plain English statements from the applicant describing their impact are really, really powerful for reviewers.

Assessing Leadership

Prof. Yvette Roe: Track record on leadership should be considered in a very diverse way. Leadership might not be at the front of the pack leading a whole team, but it might be attributes that the author has described. Critical ability, able to bring a team together, able to do a collective effort, able to be an advocate or a specialist in a team. So be mindful that when you assess leadership, that you're looking at a very diverse definition of what that might look like.

Prof. Jennifer Stow: And you'd expect leadership to be commensurate with both career level and opportunity, and also the level of impact someone's having. So you like to see that people are not only having impact in their own right, but exerting leadership in a way that brings other people along with them, provides opportunities for trainees and students and grows research capability around them as they progress through their careers.

Assessing overall Track Record

Prof. Sarah Russell: So rather than saying, for instance, that you can't have an excellent without at least three first author papers a year, I would say there are several reasons why you might score in that top category. And if you miss one, so you haven't got quite that many last author papers, you can still be in the running by, for example, articulating how you were a critical part of papers on which others were first or last.

Similarly, for things like leadership, you can say you've built up an enormous team, won loads of grants and been a keynote speaker at lots of very important conferences. But then there are lots of reasons why you might not be a keynote speaker at lots of different conferences: there could be personal reasons, that you just don't travel, or other things. There have got to be many ways you can achieve these scores, and one example would be in leadership, where I'll ask the question, "What have you done that goes beyond expanding your own success?"

So I look for a degree of altruism here in the leadership section, and there are many ways, and many people do a lot of leadership, that go beyond their own success, and I think it's very important to reward that. So whether it's that they've realised that Australia is missing a key network that connects up people with a particular skill or resource, or whether it's nurturing a particular cohort of people that needs a bit of help to get them over the line, there are a lot of things beyond just making sure that your postdoc wins a fellowship.

Assessing Track Record Relative to Opportunity

Prof. Sarah Russell: I think it's worth keeping in mind that to provide a score for track record, we are tasked with converting qualitative information on both achievements and opportunity into a single number. At one level, there's a temptation to approach this task by first scoring achievements as a whole, second scoring opportunity as a whole, and then somehow modifying one number by the other. I don't find this a helpful approach. More valuable to me is to remember that our primary goal is to ensure value for the public's money by using past achievements as a reflection of future potential, and in that light to consider each of the applicant's achievements in light of their opportunity.

Prof. Patsy Yates: So when I'm doing a relative to opportunity assessment, there are many things that I take into account, and probably one of the most important points is that I like to frame this assessment in a holistic way, because when you think about a career interruption or something that's impacted your opportunity to achieve research outputs, for example, they're often related. You can't just take a three month gap or a six month gap and say that's going to have the same impact for everyone, because it might also have impacted subsequent research activity that a person may have had the opportunity to contribute to.

Prof. Sarah Russell: And again, if the applicant's been in and out of the workforce due to young children for a few years, I recognise that as well as the reduced publications over that time that's accommodated in the application, there'll also be missed opportunities to recruit students and submit grants and so on and that'll impact on their productivity for years to come. So again, I consider that as part of the decision as to how impressive was their achievement in light of that track record.

Prof. Yvette Roe: Not all of us have a very traditional pathway when we think of our research career, so relative to opportunity must consider not only their ability to be part of a research team and to produce outputs, but also some of the personal considerations. So people come into research with a whole range of social commitments; be mindful of that. Research is not a linear trajectory, so be mindful that people will have their own journey, and consider whether the author, again, justifies their opportunities to participate as a high performing researcher.

Prof. Nicholas Talley: You really need to be mapping out the particular applicant's course and trajectory from all of the material that's available to you in the application. And it's so important here to be as objective as possible, and obviously to look at the applicant's circumstances. I think it's important that peer reviewers don't compare themselves or their colleagues or their mentees or mentors as they're thinking about the relative to opportunity scoring. I think it's really important to recognise every situation really is quite different, and you have to judge each career path and trajectory individually.

For example, their clinical load if they're clinician researchers, as set out in their career overview statement, or their teaching load if they're an academic, needs to be taken into account. Sometimes that's a little bit difficult to judge, particularly admin load and what that means, but again, hopefully the applicant has expressed this clearly, so you understand how to adjust your scoring depending on all of this.

And then you have to use your experience to understand the impacts of all of these factors, teaching clinical duties, moving labs, research years and the likely productivity issues as you judge it. I think this is one of the most difficult things to do in an application, but critically important.

Prof. Jennifer Stow: So I think the new career context section in the application is very helpful in putting relative to opportunity considerations in the context of someone's whole career. So it makes it easier to see what impact career disruptions have had at different times in someone's career. And for careers that might be research focused, or teaching and research, or clinical, I think the new career trajectory statement helps you to judge relative to opportunity for all those different types of careers much more clearly.

And so I think for new reviewers having to assess this, it's a matter of looking at individual justifications and career trajectories and seeing overall how much opportunity, how much time, how much bandwidth people have had to spend on their research and what their productivity is relative to that, what their leadership is relative to that, what their impact is relative to that.

Assessing applications across different fields

Prof. Patsy Yates: You've been chosen because you're an expert reviewer. So you are bringing your understanding of what's relevant to that field, what's appropriate in that field. So my advice is, again, that you bring that understanding of what's relevant.

Prof. Yvette Roe: My work is in health services research. Often, I'm looking at apples and oranges. So when I look at that, I'm trying to look at each application on its own merits. What I think I know about a field is really important, so I try not to make assumptions. Again, recognising my biases: what are those biases based on? So I ensure that I read what's in front of me independently, that I make an assessment and question why I make those assessments, so that the applicant, even if they're not in my field, has the best chance of getting an independent score from me.

Prof. Nicholas Talley: This can be somewhat difficult, but I think, again, if you are clear in your mind before you start the process of peer review what your expectations are according to the field or area, I think this can help you to judge these fairly across all of the applications that you are assessing. I mean, it wouldn't be reasonable to expect a basic scientist to have an impact on clinical practice directly. It does happen actually, it can happen, but that would not be expected and in fact would be unusual. So if it's basic science or research programs that you are looking at, that is a reasonable expectation to have in your mind before you start.

Prof. Jennifer Stow: So judging research opportunities and quality of impact and things like that across different career paths can be quite difficult. I do think the new career context statements make that easier. It's easier to see how much time clinicians have had to spend on research and compare them to people who are doing teaching and research, or research only. I think the whole package of somebody's application is what's important to take into account when you're comparing clinical researchers, public health researchers and basic researchers, in all their different career positions. 

Benchmarking applications

Prof. Yvette Roe: Be mindful that the applicant has spent a significant amount of time to present a program of work, and they deserve your time and commitment to do a solid assessment of it. I will usually run through and do a first read of all the grants that I'm reviewing, and then I'll make time to go actually a deep dive and then review them independently in a lot of detail. This is a time consuming task, but we want to make sure we make solid informed decisions regarding these applications.

Prof. Jennifer Stow: Reviewing a lot of different applications can be quite daunting. I personally scan through them in the beginning and try to find one that's clearly a very good one, and one that's a weak one. And I do those first to set the benchmarks at either end, and then that helps to position the others in between those outliers. Then I go back and sit down with them all and look at how I've ranked them against each other, and if there are ones that are anomalous in that whole cohort, I might adjust them.

Prof. Sarah Russell: I think there can be a concern with letting your view of a person's track record influence your judgement of the knowledge gain section, the proposal. For this reason, I try to score the knowledge gain section before the track record and sometimes I try to do all the knowledge gain sections, create a score for those, and then go back and review each applicant for the track record sections after that. And that way you can get your mind a little bit more focused on one particular aspect and achieve a bit more consistency that way.

Prof. Christopher Fairley: The first point I’d make is be under no illusion that this is hard work and requires a lot of time and a lot of commitment. What I do to try and make them as fair as possible is I tend to do the sections all together. So when I'm doing publications, I’ll have a quick read of the application, then I'll score the publications for that one. I'll go to the next one, quick read, score the publications. I'll score all the publications of all of them together, then I'll come back and I'll do the research impact all together. Then I'll come back and do the knowledge gain all together.

So I try and not do the application, if you would like, in a sort of linear way, but I'll go across the same section across all the applications and come back and do it that way. And I think for me, that was much more helpful.

Assessing applications at incorrect Level

Prof. Patsy Yates: What do we do if we feel that it is not quite right? Well, I think that's where we need to take into account that we are doing an assessment relative to opportunity. So in some ways, by taking that relative to opportunity assessment into account, we're really going to score people on where they're at in their career, not necessarily needing to worry about benchmarking at a particular Level; we are really looking at their career and where they're at. 

Prof. Sarah Russell: Well, you are asked to take into account the Level they applied for, but the guidelines are very clear that it's up to you to take into account a range of considerations when scoring. So if the Level applied for doesn't feel right in the context of their descriptions of the category level justification and all the other relative to opportunity considerations in the application, I would probably give it a much lower weighting. And from that perspective, I think you should be very comfortable ranking them against the cohort that you feel they do belong to. 

Single Summary Statement/Feedback

Prof. Patsy Yates: We want to enable them to develop their research in ways that continue to improve. So put yourself in their shoes and think about why you are providing this feedback; I think that's always a really good starting point. You should be able to defend that feedback as if you were providing it face to face to someone.

Prof. Yvette Roe: We know that research can be better, it can be tighter, it can be more concise, and we want it to make an impact. So be mindful when you're writing those comments: how are you going to provide constructive feedback that allows the applicant to improve their grant? Whether it's successful or unsuccessful, we've got a community and we want to support better research. How will your comments contribute to that?

Prof. Jennifer Stow: We can in that space highlight strengths and weaknesses, particularly if there's something we think they have taken the wrong approach to in the application, or if they have left out some very meaningful information in their justifications or their career trajectory statements.

Prof. Christopher Fairley: You should stop and think for a minute: what was it that made me score this application the way I scored it? I think you've got to focus on why it was that you felt that this application wasn't very good, and then provide some constructive evidence. The best way to do that is to start the sentence with "the application could have been improved if there was more…". So you're not saying there was none or it was terrible or anything, just saying "if there was more attention paid to explicitly describing the evidence that supported the claims", which I think is probably the single thing that's done the worst.

Unconscious bias

Prof. Patsy Yates: The most important way of addressing unconscious bias is actually through self-awareness, and it's so critical that you are aware of what biases you might bring to the assessment of any application. I think once you're aware of them, it helps you, when you are looking at your overall assessments, to ask yourself the question: have any of my biases potentially influenced the way in which I've undertaken that assessment? One of the ways in which I have tried to address unconscious biases in the past is simply by talking about them with other people as well.

Prof. Yvette Roe: Do not be frightened by your unconscious bias; we all have it. We have it when we walk down the street, we have it when we look at a TV show and we have it when we review applications. Be mindful of the assumptions you're making about the discipline, how the problem might be solved, or who's in that team. Address those biases, be honest with yourself, and ask, "Can I do an independent review to the best of my ability?" So the thing would be to address them and also make the honest assessment: how does this impact on the score that you're giving them?

Prof. Christopher Fairley: I think if you're aware that you might have unconscious biases, so you make them conscious, then you can deal with them.

Overall approach

Prof. Patsy Yates: I think we take for granted how important it is that good peer review is what will enable us to undertake the best research in the community to achieve the best health outcomes for our community. So it's not something that's trivial, it's a really important part of the research process, and I think that means taking on board all of the guidance and advice and support and development opportunities that NHMRC is providing peer reviewers, because that means that that service that you're providing as a peer reviewer will have the best possible impact that it can.

Prof. Yvette Roe: The first time that you do this, you'll be nervous. And you should be; it's a big responsibility. The thing is, NHMRC is about ensuring that we're supporting the best research, but we're also encouraging the best researchers. As a grant reviewer, this is a really exciting opportunity for you to contribute your expertise, your understanding, and your vision for research in Australia. Ensure that you provide a really insightful, independent and terrific experience both for yourself and for the applicant.

Prof. Nicholas Talley: Our job as reviewers is to do the very best we can by each individual, to avoid bias where we possibly can, and to consider each proposal fairly, individually and as expertly as we can.

Prof. Jennifer Stow: So it's important not to be overwhelmed by the process. Take a quick scan through the applications when you get them, but then assign times to go through them in detail, and a time for looking at them all at the end.

Prof. Sarah Russell: I think it's a really healthy thing in the system that we've got at the moment, that there is an opportunity for different people to have different views. And there's no question that there are many ways to be a great medical researcher, and there are many ways to perceive what is a great medical researcher. 

Prof. Christopher Fairley: These applications matter. These are people's lives and livelihoods, and it is important that we do what we can to get them right. So these things matter, and you shouldn't take this job on unless you are prepared to do it properly. And to do it properly, you need time to think, to do it carefully and to be self-critical of your reviewing along the way, and to try to do it in a way that is as fair and reasonable as possible.

Briefing webinar video

The 2023 Investigator Grant CEO peer reviewer briefing webinar video follows:

Facilitator welcome

Dr Julie Glover (Executive Director, Research Foundations, NHMRC):

Welcome, everyone. And a big welcome to you to this 2023 Investigator Grant peer review briefing. So, I'd like to begin by acknowledging the Ngunnawal people, traditional owners of the lands on which we are meeting in Canberra, and to acknowledge all of the other traditional lands that you are joining from. I pay my respects to Elders past, present and emerging and acknowledge the Aboriginal and Torres Strait Islander people who are attending the meeting today. Thank you all for the very important roles that you are undertaking and the contributions that you are making to NHMRC’s peer review, and also for taking the time out to listen to our peer review briefing today and contribute to questions. And for those of you who are listening later online, thank you for taking the time as well.

My name is Julie Glover. I'm one of the Executive Directors here at NHMRC, and one of my roles is to manage the Investigator Grants peer review process; I'll be facilitating our discussion today. I'm joined here in our Canberra office by our CEO, Professor Anne Kelso, who will be doing the majority of the speaking, talking to you about your roles as peer reviewers. I'm also very pleased that we've got some very experienced peer reviewers joining us here today, some of them in their role as a peer review mentor. We will talk a little bit more about that later and will advise who the peer review mentors are for this round.

We also have some community observers joining us, and I wanted to say a special thank you to them and thank them for their really important role in advising us on our peer review processes. Anne is going to present to us the key peer review considerations.

Then we'll have a chance for people to ask questions. So, you can either send your questions through in the Zoom chat, or you can use the hand up function to ask them directly if you would like. If we don't get to all of the questions, please don't be too concerned. We'll make sure that we address them in material that comes out to you after this briefing. And lastly, I just wanted to let you know that this webinar is being recorded, and the first part will be put up online for other peer reviewers to hear who weren't able to attend today.

The questions part of the webinar will also be recorded, but that's just for our own internal use. We didn't want that to constrain you asking your questions. What we'll do with that material is we will summarise the questions that are asked and our responses, as well as adding any supplementary material we think might be helpful and we'll make that available to all peer reviewers after the meeting as well. So right now, I'll hand over to Anne, who will run us through the presentation and then we can have time for questions at the end. Thank you.

CEO peer reviewer briefing

Professor Anne Kelso AO (CEO, NHMRC):

Thanks, Julie, and good morning, everyone, wherever you are around the country. Really appreciate you joining us for this briefing today and particularly for the discussion that we'll have after I've run through a few slides. I also want to add my thanks for your undertaking of this work. It's a huge task. It's a really important task. We know that it is something that takes a lot of time and a lot of energy. And of course, we're completely dependent on the peer reviewers who help us with this scheme. So, I just do want to express my absolute appreciation for the work that you have agreed to do with us. Now, I'm going to run through a few slides that go through aspects of the scheme and some advice for peer reviewers, and some of this will be quite quick.

But I think there may well be issues that come up during the slides, and I hope we'll have plenty of time during the Q&A to address any of those later. As Julie has just described, I also know that in a group like this we'll have some people who've done a lot of peer review in the past and others who are relatively new to the peer review process for NHMRC. And so, you'll all have different questions and experiences of peer review. But I think it's really great for us to see that you're willing to come and hear where we're at with peer review today and what we're hoping that you'll be able to do with us.

Slide 2 – Overview (4min 25sec):

So, the first slide is just a brief list of the things that we're going to cover in this presentation. You can see it's quite extensive, and I'll run through each of these points as we go along. I'll also try to highlight any changes that have been made in the last couple of years. So, if you haven’t reviewed for us in the last year or so, there may be things here that are completely new to you. And again, I'm very happy to talk about them some more during the Q&A.

Slide 3 – NHMRC’s grant program overview (4min 56sec):

So first of all, just by way of background about NHMRC's grant program: most people, of course, will be broadly familiar with what we are here to do, but I think it's always useful to start by remembering that the purpose of NHMRC funding is to improve the health of the Australian community. But we take a really broad view of that. It doesn't mean that every grant application has to be immediately about addressing a specific community or health issue.
We understand that the research we need to support ranges across a very broad spectrum from the most basic discovery research through to clinical research, population health research and improvement in health services. So, we do take a very broad view, but we are always looking for research that in the short, medium, or long term will improve the health of the Australian community. So, our purpose is to fund the highest quality of health and medical research and the best researchers to do that work and to create a very broad base of knowledge and research capability for the country.

So then when we come to the Investigator Grant scheme, you'll be aware this is a very important scheme within that overall grant program. It's a large scheme. It's about 40% of the funding that we disburse each year and it's focused on support for outstanding investigators and their teams.

And as you'll know, it's got five career stages. So, it's seeking to support people from quite soon after the completion of their PhD all the way through to the most senior researchers. We did something very important a few years ago, I think, when we created this scheme, and that was to consolidate the support for the salary and the team into a single package. And so, the grants provide a salary to those who need it.

And of course, there are plenty of people in our community who have a salary from their university or their institutes, so they don't need a salary from NHMRC. But everybody with an Investigator Grant receives a research support package, the size of which depends on their career stage. And by providing a single five-year grant like this as a package, then what we're seeking to do is provide investigators with absolute flexibility to do the best research they can to pursue new and innovative research directions.

Slide 4 – Peer review principles (7min 18sec):

So, when we think about peer review for all of our schemes, we are looking to address the broad peer review principles that are outlined here. The process should be fair for everybody. It should be transparent, so that as much as possible is understood about the application process and the peer review process while, of course, protecting confidentiality. It should be independent, it should be appropriate and balanced, and it should involve community participation. As Julie has mentioned, we have community observers participating today and overseeing the whole process for NHMRC, and that's an incredibly important part of the process.

Confidentiality, I'll say a little bit more about [later in the presentation]. It should be impartial and of course we're looking for the highest quality and excellence of peer reviews.

Slide 5 – Peer review is important (8min 7sec):

Now, peer review is incredibly important. I mean, obviously all the decisions we make about what to fund depend on the work that you do as peer reviewers. We absolutely depend on the scores that you provide and the feedback that you provide that underpins those scores. So that means for each scheme, it's really critical that the scores are made in reference to the category descriptors for that scheme and the assessment criteria for that scheme. So, look carefully at them for Investigator Grants when you're undertaking peer review for this scheme.

I think we also, all of us who are involved at any level in this process need to remember how much effort applicants put into preparing their applications. I think any of us who's ever prepared a grant application knows how much work this is, and particularly for people who are seeking to find their own salary and their support for the team. This is really a very, very critical piece of work.

So that means as a peer reviewer you have a responsibility to give each assessment the due care and attention that you'd want given to your own application and to provide constructive feedback. And we at NHMRC will always do our best to respect the enormous effort that applicants put into their applications as well as the effort that you as peer reviewers provide. And again, perhaps particularly in the case of this very big and important scheme, we know that the outcomes of the Investigator Grant applications will have a significant impact on most, if not all, of the researchers who receive the grants as a result of the peer review process.

Slide 6 – Disclosure of interests and confidentiality (9min 43sec):

So, peer review is important and again, we're just very grateful for you undertaking this critical task. I wanted to say a little bit more about confidentiality and also about disclosure of interests. Disclosing interests that you might have relevant to the grant applications that you review is really critically important, and don't think that you should have no interests.
I imagine that just about everybody, maybe everybody who's involved in peer review for this scheme, has some kind of connection with NHMRC already, and that means that you will have collaborators in institutions around the country, and you'll have friends in institutions around the country. So, it's completely normal that anybody who's involved in peer review will have some interests. What's important is to declare those, if they're in any way relevant to the applications that you're asked to review.

It may be that you will realise after you've agreed to assess an application that you have a conflict of interest or a potential conflict of interest. Then it's critically important that you get in touch with the Secretariat to advise us of that, so that alternative arrangements can be made if that conflict of interest is too great to allow you to review the grant. So, the main thing is just to be open with us about that so that we can have the fairest possible peer review process for applicants and everybody across the scheme. That's a legal obligation, but obviously it's a matter of fair treatment of applicants as well.

Confidentiality is also a legal obligation under the Privacy Act, and it's also just the right thing to do. We want to respect the privacy of information that's provided through applications and also the confidentiality of people's research ideas and the potential intellectual property that they might be presenting in their applications as well as any other personal information that might be provided. So that's enforced through a document that used to be called a Deed of confidentiality. It's now called NHMRC obligations in relation to confidential information. It's a pity we have to use more words to say the same thing, but it means the same thing.

It is a lifetime commitment. So once the peer review process is over, it doesn't mean you can now talk about what you've read in the applications; you must basically forget what you've read in those applications and never disclose them. So, treat them in confidence. Don't disclose anything about the applications to people who are not part of the peer review process, either now, during peer review, or later when you've finished it. And that's obviously really critically important from a legal point of view, but from a fair process point of view as well.

Slide 7 – Overview of process (12min 25sec):

So, now I want to just move quickly to the process itself, and this summarises what it looks like overall for 2023, basically the same as in the last year or two. So, applications have already been submitted, of course, and the office has undertaken eligibility checks to make sure that all applications are in fact eligible to go through into peer review. And then during this time we ask all peer reviewers to look at a batch of applications which have been selected based on information you've already provided about your expertise, and to tell us if you see any which present a conflict of interest for you and whether you are suitable to review each application.

So, that's a really critical process that underpins the fairness of peer review. There's also a process where applications which involve Aboriginal and Torres Strait Islander Health research undergo assessment against the Indigenous research excellence criteria. So that is also done [prior to the assessment phase]. And then you will receive that information. So, applications are allocated to peer reviewers and this year you should be getting between 10 and 30 applications per reviewer. And we appreciate that at the upper end, that's really a very substantial amount of work, but it's very important that each peer reviewer sees a number of applications because this really helps with benchmarking.

If you were only to see 1 or 2 or 3, I think it becomes extremely difficult to benchmark against the overall range of quality of applications in any category [and this is why NHMRC is unlikely to assign applications for review any lower than about 12]. So that's a really critical part of this process that we ask you to look at quite a significant number in any one round. You then undertake those assessments independently and that process will go through till mid-June. And then based on that we will have ranked lists and funding recommendations generated and those recommendations will go to Research Committee and then to Council. [Noting that] Research Committee and Council don't see the names of the applications. They simply oversee the process that we've undertaken [and approve the final figures]. And then those recommendations are submitted to the Minister for Health and Aged Care for approval.
So, all of the dates there [outlined in the slide] are indicative and there can sometimes be changes for special reasons, just as we had in the pandemic year of 2020. And I should have put the second highlight, the second footnote on the mid-June date, not the August 2023 date, in order to be able to get into a better alignment with Ideas Grants in future years.
So, we've got this problem this year: in order to get the timelines better for the longer term, we're going to have a second round of Investigator Grant applications opening late this year [September 2023] for funding in 2025, but that puts extra pressure on this year's round. And so, we really ask you to help us to try to meet those deadlines, in everybody's interests.

Slide 8 – Assessment Criteria (15min 22sec):

Now, I want to move on to the assessment criteria and then into a bit more detail. You’re probably already familiar with the fact that we have, in this scheme, a very strong emphasis on track record [which is] 70% of the score. We'll talk a little bit more about the publications component of this in a minute because it's changed a little in the last year or so. Research impact is one that I won't talk about in more detail now, but again, it's often something that people want to discuss in questions, so we might come back to that. I'll also say a little bit more about the leadership component because there've been some changes to the guidelines there, and then 30% of the assessment is based on the knowledge gain that will come from the project proposal.

And it's important to remember here that we are providing really flexible funding to the people who receive these grants. So, you're assessing the overall quality and significance of the knowledge that will be gained from the proposal. But if you happen to see that there's overlap with other funding that this individual already has, don't worry about that. Just deal with this as a description of their research program that the Investigator Grant, if successful, would provide assistance to.

Slide 9 – Publications (16min 38sec):

What we are now doing across schemes is focusing [on the] assessment of the publications track record on the individual’s Top 10 publications in the last 10 years. Now this is up to the top 10. So, if somebody has fewer than 10, that's fine, but it's to focus on the quality rather than the quantity of [publications]. I'll say a little bit more about that. You might know now that preprints are allowable publications for this purpose. That was a change made a couple of years ago.

We've introduced a revised Relative to Opportunity Policy and Career context section that I'll say a little bit more about [shortly]. And it's just really important when you consider any aspect of track record that you take into account the opportunities the applicant has, and the Career context section gives a lot of information about that. We also clarified the statement of expectations. That is the justification for why somebody has applied for EL1, EL2 or Leadership [level] 1, 2 or 3. Those have been clarified and that continues in this year's round.

Slide 10 – Publications in track record assessment (17min 45sec):

So just to say a little bit more then about publications. So, as I said, we now are not providing the whole list of publications for you to review, and we ask you not to go hunting in PubMed or websites or wherever to look at the long list. What we really want you to focus on in the scoring of top publications is the Top 10 that the investigator has selected to bring to your attention. And so that's the up to 10 in the past 10 years taking into account career disruptions. So, as I said, the purpose here is really, in fact across all of our schemes, to increase the focus on the quality of the work that people have done rather than quantity.

We don't want to have a system that drives people to salami slice their research into lots of minor publications or peer reviewers to be presented with a list of 300 publications which they're then expected to comment on for quality. So, by asking the applicant and the assessor to focus on their best 10, or up to 10 if they choose to provide fewer, this is really so you can focus on looking at whether those publications have the quality that you'd expect and are making a genuine and significant contribution to science. So, it's always a question then what do we mean about the quality of published research? And that's really important.

This is really about the rigour of the research and how it's been conducted, and the design of the research, not the standing of the journal. There isn't a shortcut here that says because it's been published in Nature, it must be good.

Think about what was the publication’s contribution to science? And what was the applicant's contribution to the publication? And based on all of that, make a holistic assessment of the quality of the publication record as provided by the applicant, who will choose those publications for a number of different reasons. Sometimes it might be because they led the research, sometimes it might be because they made a critical contribution as a secondary author to a much larger piece of work.

Slide 11 – Publication category descriptors (19min 47sec):

So, think broadly about what their contribution to science is through the publication record. This slide is basically repeating much of what I've just said. It's 35% of the score. Use your best judgment in differentiating between the category descriptors. We just provide you with a set of words, but think on an international scale about what you're looking at. Benchmark against the opportunity of the applicant to undertake their work in the 10 years, or 10 years adjusted for career disruption.
Make sure that the publications are within the 10-year timeframe, taking into account career disruptions where they've been declared [and how this shifts the 10-year period]. Disregard references to the quantity of publications or journal metrics, such as 'this is a high impact journal' or 'this is my H-index'. We really want you to focus on the quality of the science itself. We know that's real work, and I hope that it's made much easier by being up to 10 publications, not up to however many somebody might have published in the last 10 years.

Slide 12 – Relative to Opportunity (20min 53sec):

So relative to opportunity, this is always a very important topic. The policy was revised in 2021 to clarify the range of relevant circumstances. The Career Context section was introduced in 2021 so that reviewers can really understand what the relative to opportunity assessment is for this individual [and their career]. And a really important change in our view is that now every applicant is required to complete a Career Context summary, describing their own individual circumstances and the opportunities they've had to undertake research. So take that into account for every applicant that you're looking at. Disruption of research due to COVID-19 is a really important relative to opportunity consideration that will be important to many people.

Slide 13 – Statements of Expectations (21min 38sec):

The statement of expectations is provided to the applicants to help them choose which level to apply at, and they must then justify their selected level. So, when you come to look at this [level] and look at the applicant's justification, you need to consider that relative to opportunity, and consider whether they have in fact applied at the right level, not below or higher than the level that is appropriate for the point in their research career and their research profile.

Now, the statement of expectations does provide guidance on the typical number of years post PhD at each level and the typical academic level for each level of the Investigator Grant scheme. But these are not absolute and they're not [part of the] eligibility criteria. It's really important for the applicant and for you as an assessor to look at the descriptors as well.
The typical years post-PhD and academic levels are just guides, but we know when we look at applicants across our schemes, there’s huge diversity in career trajectories and in the types of opportunities that people have had. So, we really want you to consider those descriptors and think about the person in that context. They're a guide. They're not to be used for eligibility, but they will help you in deciding whether somebody has justified appropriately the level at which they've applied.

Slide 14 – Updated Leadership criterion (22min 58sec):

Now the other part of the process [and] criteria that’s been updated recently is the leadership criterion, as I mentioned before. What we wanted to do here was to recognise a broader range of leadership contributions that people make to create the sort of environment where research can be excellent and appropriately supported. So this is to get away from thinking that leadership is only about heading an institute, being a really senior person, or having a major job. It’s leadership at every level of the research career, and it includes things like fostering collaboration and being a good mentor, the leadership contributions that are critical for the effective performance of our research sector across every career stage. And so, we've attempted to revise the wording to really encourage both the applicant and the assessor to think very broadly about what leadership is.

So, there are now four leadership elements: research mentoring, research policy and professional leadership, institutional leadership, and research programs and team leadership. Applicants can present an example of their leadership against these elements, but they don't have to present examples against all of them. They're simply making the best case that they can to indicate their contribution as a leader at their level, in their field and in their institution. So I think it's good to think very broadly about what leadership is and recognise all those different types of leadership.

Slide 15 – Peer reviewers need up-to-date fields of research listed in Sapphire for grant matching (24min 39sec):

Now, another critical thing at the moment and always for peer reviewers is to update your [research expertise] fields in Sapphire. They [categories and words] get changed from time to time and we have continued to improve our ability to match assessors to grant applications [as a result] and that absolutely depends on the information you provide.

First of all, in indicating your research expertise in Sapphire, and then through the suitability and conflicts of interest process. So, updating your research expertise is a really important part of this process. Indicating your broad research area is another really useful guide for us. And it's important to note that we've recently introduced a description of what we mean by the broad research areas. We've been using the broad research area as a category descriptor, as a description of what we fund, for many, many decades. It's our longest lasting piece of data, but we haven't until now indicated what we actually mean by those broad research areas. So please look at the descriptions and choose the ones that best fit your expertise or, if you're an applicant, your application.

So [it’s] really helpful for us if you keep all of the information up to date in Sapphire; that allows us to not waste your time asking you to look at applications that aren’t relevant and improves our matching all the way along.

Slide 16 – Resources for peer reviewers (26min):

So now I want to turn now to some of the resources that are available and to talk about some tips for peer reviewers. Really important to read the Peer review guidelines so you understand the processes and responsibilities. We're just running through them very briefly today. Review the aims of the scheme; the assessment criteria and the category descriptors; and if necessary, just keep referring back to those as you go through your 10 to 30 applications that you're reviewing. We hope that the Peer review mentors, mentor video and the support pack will be useful to you. Please review them, and do so again if you need to, throughout the process.

We ask that everybody, even if you've done this in previous years, each year undertake the online Implicit association test, which tests us for gender and science bias, and also review the Royal Society video on understanding unconscious bias. I think we all have biases, and the test and the video just really help to remind us how these things can play into our thinking, and help us, I think, to be aware of them and try to counteract them as we go through our work. And then, of course, seek advice from the peer review mentors whenever you need to; I'll come back to that in a minute.

Slide 17 – Tips for peer reviewers – Assessment (27min 18sec):

So, some tips for assessment. The number one, biggest, most important tip is to start reviewing early. We're asking you to undertake [peer] review of quite a pile of applications over roughly a four-week period, and you really need to plan your time in order to be able to do that work, not leaving it all till the last minute when something could go wrong and you're not able to complete it. Spread out the load so that it's not too stressful or exhausting at any one time. So really try to plan your time to the extent possible, and I think that will help the process.

Many people find it useful to group the applications that they've received by level and assess them in a group, and that can help with benchmarking. So whatever works for you, but many people find that helpful. We really ask that you complete your assessments within Sapphire, and one of the reasons to do that is that it reduces the error rate of people uploading the wrong set of scores for an application. Unfortunately, it tends to happen each year that people who are working offline incorrectly upload their scores against the applications, and that creates problems down the track. So, if you can complete your assessments in Sapphire, that avoids that potential error, and Sapphire will keep saving as you go along. But if for whatever reason you need to work offline, there is an Excel template provided.

You'll still then, of course, need to upload those scores manually into Sapphire, so it's an extra stage of work later. But if you do decide you're going to do that work offline, would you please let the Secretariat know.

That's because they're going to be watching to see whether you've started your work and how you're progressing through it. You may be working away in the background, but if they don't see any progress they will get very worried and will start hassling you. So please let them know. Also, let your Secretariat know if anything comes up where you have a problem or you need clarification or help. But, bottom line, to say it again: start reviewing early.

Slide 18 – Applicant feedback (29min 27sec):

So, the next thing is applicant feedback. I think applicants really appreciate the feedback that you provide on each application, and it's also a quality control process for us, so I'll explain why. When you undertake peer review you're asked to provide scores, but you must also provide qualitative feedback. This is really critical. That feedback will go to the applicants exactly as written, and it will also now go to the fellow peer reviewers on that application. So, if you're one of 5 peer reviewers for an Investigator Grant application, once you've submitted your review and all the reviewers' comments are in, they will be shared amongst the 5 peer reviewers for that application.

So, it's important to think particularly about what it will be like for the applicant when they read your report. You need to explain why you've scored in the way you have: comment on the strengths and weaknesses of the application so that the applicant can understand your scores. Make sure there's really good alignment between your comments and your scores, because otherwise people will worry there was a mistake and won't understand why you scored the way you did.

Write clearly and try to be really specific in what you say. Make sure your comments and criticisms are constructive and are going to be helpful, because that's one of the really useful parts of this feedback. Always keep the tone professional and objective, remembering that there's a human being at the other end.

There's some more advice on preparing feedback in the Peer review support pack. The NHMRC office will check through all comments for inappropriate content, but remember that the Secretariat staff won't be able to tell whether your comments are appropriate for the science.

They'll be looking for inappropriate content at a high level, not for whether the comments you've made are appropriate for that application, so nobody at the office will be able to check for that. But your fellow peer reviewers will see your comments, and that has turned out to be very useful for us: occasionally they've come back and told us that they think certain comments didn't apply to the application. So that's one of the ways in which the sharing of reviewer comments is a quality control step. But most importantly, I think it means that you can see how other reviewers have looked at the same application, and we understand that's a useful learning process for peer reviewers themselves.

Slide 19 – Application-centric matching of peer reviewers (32 min):

We're using up the time fairly fast, so I just want to run quickly through an explanation of what we now call application-centric peer review. This is the way we match peer reviewers to applications. If you've been reviewing with NHMRC for a long time, you'll know that we used to run Grant review panels (GRPs) for all of our schemes. What we would do then was find a set of applications that needed to be matched to the set of people on a GRP, and that's what we call panel-centric matching. If you're going to have a GRP meeting where grants are going to be discussed, then of course that's how you have to organise the matching of peer reviewers to applications.

But it is much, much harder to match the expertise of reviewers to applications and to avoid conflicts of interest when you're setting up GRPs for a very big scheme like Investigator Grants or, in fact, Ideas Grants. So, in Investigator Grants and Ideas Grants we now take a different approach, which is effectively to make a specific panel for each application; that's why we call it application-centric. We now look for the most appropriate 5 reviewers for each application, giving the best matching of expertise without conflicts of interest. The text on the right-hand side of the slide may be visible on your computer screen; I won't read it now.

Slide 20 – Application-centric suitability of peer reviewers (33min 26sec):

I just want to show you 2 or 3 slides which look at how application-centric matching has improved the suitability of reviewers to applications. In 2019 and 2020 we used a different process, more aligned with panel-centric (GRP) matching, even though we didn't hold GRPs in 2020 because of the pandemic. In 2021 and 2022, and now in 2023, we've had a full application-centric matching process which does not rely on the formation of GRPs. What I hope you can see from this slide is that when people were asked to rate their suitability against the applications finally presented to them for review, as no, limited, moderate or yes suitability, we had very much higher moderate and yes suitability in 2021 and 2022 than we did in earlier years, and of course we're hoping for 2023 as well. In 2022, 86% of people were given applications to review for which they said they were suitable.

Slide 21 – Application-centric – reviewer responses (34min 34sec):

Then, when we asked in the Peer review survey after the process had finished whether people agreed that, in general, the applications assigned to them matched their area of expertise, you can see at the top of that bar graph that 91% of people agreed or strongly agreed with that statement in 2022, compared with only 48% back in 2019 when we used a GRP-based process. So the matching has been getting better and better as we've moved to a more application-centric process and as the matching itself has been refined in the work the office does based on the information you provide.

Slide 22 – Application-centric – reduced assessment burden (35min 13sec):

The other thing we're really pleased to see is that it's been possible to reduce the assessment burden for peer reviewers as the years have gone on. In 2022 we only needed to provide an average of 18 applications to people reviewing Leadership applications and 15 to people reviewing Emerging Leadership applications, compared with a much larger number in earlier years. We'll continue to work on striking an appropriate balance between giving you enough applications to review that you get good benchmarking, but not so many that it becomes a hugely burdensome task, because we do recognise that for many people it is a very big piece of work.

Slide 23 – Peer review mentors (37min 28sec):

Now, we're really delighted that we have a number of people who are willing to be Peer review mentors for us this year. That means they're going to be available to you, via the Secretariat, to provide advice on how to do peer review. There are six shown on the screen here, several more than we had last year. Some people have come back, and we're really delighted to have some additional people helping us with this process this year.

Patsy Yates is at QUT (Queensland University of Technology). She's a Distinguished Professor and Executive Dean of the Faculty of Health at QUT. Patsy is a registered nurse with extensive experience as a leader in education and research in the health sector, and a long-standing committee member at NHMRC. Before her appointment as Executive Dean, she was Professor and Head of the School of Nursing at QUT, and she's also co-director of the QUT Centre for Health Care Transformation. So she is very experienced with health services research and clinical research.

Professor Sarah Russell is at Swinburne University of Technology and the Peter MacCallum Cancer Centre. Her interest is in understanding how immune cell development occurs and how polarity and tissue organisation influence cell fate. I have known Sarah for many years and know of her very strong interest in T cells: how they develop, how they respond to pathogens and cancer, and how errors in their development can lead to leukaemia. So Sarah has very different expertise from Patsy, for example.
We're really delighted to welcome Professor Mark Nicol as a new Peer review mentor. He is a medical microbiologist in the School of Biomedical Sciences at the University of Western Australia, and also holds an honorary appointment at the University of Cape Town in South Africa. His passion is using modern molecular tools to understand complex microbial communities. So again, a very different expertise.

Stuart Tangye is from the Garvan Institute of Medical Research, another passionate immunologist, in this case in human immunology, cell biology and immune deficiencies. Stuart has been at the Garvan since 2006, and he holds an Investigator Grant; in fact, he was the top-ranked Leadership applicant in the first round in 2019. So he knows how to write a good Investigator Grant application.

Professor Jennifer Stow is at the University of Queensland. She's a molecular cell biologist and head of the Protein Trafficking and Inflammation Research Laboratory at the Institute for Molecular Bioscience.

And finally, Dr Beth Allison is from the Hudson Institute of Medical Research. She is a vascular physiologist with a strong interest in the developmental programming of health and disease. Beth also holds an Investigator Grant and works with the Neurodevelopment and Neuroprotection and the Prenatal Transition Research Groups in the Ritchie Centre.
So, these are 6 really terrific people: very experienced grant writers, grant holders and peer reviewers. They really know the system very well and have contributed to NHMRC in many ways over the years, and to other international funding systems. They are available for you to seek advice from, not on the science of the applications that you're reviewing, but on broad peer review questions about how to approach issues. If you want to seek advice from a Peer review mentor, please do that through the Secretariat and they'll put you in contact as appropriate.

Slide 24 – Gender equity (39min 29sec):

So then I just want to say very briefly what you probably already know: we have a new gender equity initiative in this year's Investigator Grant scheme and for the years ahead. We've talked about that a lot in the last year and I'm not going to go over all the background to it; I hope you're already familiar with it. But there are some important aspects of these changes to note.

First of all, something we won't change is that we will continue to use structural priority funding, which is near-miss funding, for women at the Emerging Leadership levels 1 and 2, to the extent necessary to achieve our gender equity targets, which are equal funded rates for men and women applying at those two levels. For the Leadership category, the new initiative, for L1, L2 and L3 combined, is to target awarding equal numbers of grants by gender across the whole of the Leadership category.

Now, that's subject to there being sufficient grants from both genders that reach a certain threshold score, so there's no reduction of quality there, but it is a target of awarding equal numbers of grants to men and women. Critically important this year is that non-binary researchers who declare that they're non-binary in their application will be included alongside women in both of those gender equity interventions: the structural priority funding and the target of equal numbers of grants by gender.

So those are important changes, but the really critical thing to say to you today is that this is not relevant to the peer review process you're about to undertake. We ask you simply to review the applications as they are before you. Don't consider how structural priority funding or targets might be applied post peer review; simply do a fair and considered review of all applications, and the gender issues will be considered by NHMRC after the peer review process. It's really central to the integrity of peer review that you review what's in front of you and take into account the relative to opportunity and career context information that has been provided.

So now we have some time, I hope, for questions and answers, I'll hand back to Julie to lead that part of the program.

Thank you to webinar participants

Dr Julie Glover: Thank you very much, and thanks for the great discussion. If there's anything else that comes up that you think about after this, or as you're going through your peer review processes, please don't hesitate to reach out to your Secretariat; you'll have a particular secretariat nominated for you. Also, as Anne mentioned, we do have the mentors available, and we will have drop-in sessions with those mentors, which I think will be very useful as things come up while you are doing your peer review. So, thanks very much everyone for your contributions today, and thanks to Anne for the presentation and to the team for putting everything together. Thanks all, and good luck with the next steps in the process.
