This Ideas Grants 2022 Peer reviewer briefing webinar informs peer reviewers of the processes and requirements for peer review of Ideas Grant applications, providing helpful information and tips for reviewers in completing their assessments.

Amanda Lawrence: So, I'd firstly like to start off with an Acknowledgment of Country.  So, I would like to acknowledge the Ngunnawal people, who are the traditional custodians of the land from which we are joining you here today in Canberra, as well as acknowledge the traditional custodians of the lands from where each of you are joining us today right across the country. I'd like to pay my respects to their Elders past, present and emerging and also acknowledge our Aboriginal and Torres Strait Islander colleagues who are joining us here today.

By way of introduction, so I'm Amanda Lawrence.  I am the Director of the Ideas Grants section here at NHMRC.  I also have with me our CEO, Professor Anne Kelso, who will be delivering today's presentation, pending her voice, and we also have some Peer Review Mentors joining us for today's session and I'll introduce them and explain the process with them very shortly. But just wanted to let you know who each of us are.  I'd also like to very much thank you for your time today, for joining us.  We've had a really positive uptake of today's webinar, which has been fantastic, so we have lots of people on the line with us today and also, of course, thank you very much for your involvement in Peer Review for Ideas Grants.  The reality is we cannot award funding through this scheme without your involvement as Peer Reviewers.  We acknowledge that it is a big undertaking in terms of your own time and effort and we truly do appreciate your dedication in helping us deliver this scheme and for your involvement there.

So, how today's session will run: so, we will have a presentation.  I will share my screen shortly and Anne will run through some basics of the scheme, what we're requiring of you as Peer Reviewers in terms of the assessment criteria, and then we will have an opportunity for our Peer Review Mentors to introduce themselves. 

So, we have three Peer Review Mentors assisting us this round, and the Peer Review Mentors are those researchers with extensive Peer Review experience who are here to offer some additional support to you as Peer Reviewers throughout the process. So our Peer Review Mentors we have for this round are Professor Rosalie Viney, Professor Nicholas Talley and also Professor Yvette Roe.  Unfortunately, Yvette is not able to join us for today, but you will have a chance to hear from Rosalie and Nicholas a little bit later on. Once we've done the presentation and we've heard from our Peer Review Mentors, we'll have an opportunity for you to ask questions, either of myself, of Anne, of our Peer Review Mentors, and when we get to that part of the presentation, I'll explain exactly how we're going to facilitate that, particularly with so many people on the line.

Just so you know as well, today's session is being recorded, so we will let you know when that is available.  We do plan to publish it on our website, particularly for the benefit of those who were not able to join us today.  But we will be recording today's session. So, without further ado, I will hand over to Anne.  Just bear with me and we will get the presentation up.


Prof. Anne Kelso:  Well, while we're getting the presentation up, Hi Everyone, and I do apologise for my post‑COVID croak.  I hope I'm going to be able to get through this presentation but I'm very pleased to have Amanda here, who will step in if for any reason I can't speak.

I also want to thank everybody for your participation in Peer Review for Ideas Grants this year.  Thank you.  I know it's an enormous task: it's an incredibly important one.  We really appreciate your participation, and we also appreciate the fact that so many of you have come along to this session.  I hope we'll be able to give you some clarity and answer questions when it comes to Q&A.

SLIDE 1: Title Page

Now, I think we'll just barrel through this presentation and hold all the questions till the end, and that allows us just to do a straightforward recording of this part of the discussion.

SLIDE 2: Overview

So, to get on with it, first of all, broadly the scheme ‑ we're going to cover these issues in this presentation ‑ the objectives, the principles, go into some detail about the Peer Review process and assessment, and then end up with some tips from past experience and then some discussion with the Peer Review Mentors, as Amanda has already mentioned, followed by an open discussion.

SLIDE 3: NHMRC’s grant program

So, first of all, I think everybody probably understands that NHMRC’s overall grant program is intended to support our mission of building a healthy Australia, and I think it’s always useful to remember that that’s what our funding is for.  So, all of our funding should be supporting research that ultimately, in the short or the medium or the long term, can be imagined to increase the health of the human population.

We also seek to fund the highest quality of health and medical research and the best researchers to create knowledge and build research capability.  So broadly that's what the grant program intends to do, and then Ideas Grants sit within that with some specific goals that make it stand out from all of the other schemes.

So, first of all, there's a very clear emphasis on supporting innovative research projects to address a specific question, and we also hope that this scheme will provide particular opportunities for early and mid‑career researchers so that we can fund researchers at all career stages through this scheme.  And that means, of course, those early and mid‑career researchers have to be able to demonstrate their capability to deliver the project but we're not looking for people to have long and deep track records here, and quite specifically we don't have a track record element in the assessment criteria, but we'll talk more about that when we get to that point a bit later.

SLIDE 4: Peer review principles

We have a set of principles underpinning Peer Review for all NHMRC grant schemes and I guess, as you start this process, it's worth reflecting on these Peer Review principles and how we ensure that all of us achieve them in our different ways: fairness, transparency, independence.  For NHMRC it's important that we try to find the most appropriate Peer Reviewers for each application and also have a balance of skills and diversity of Peer Reviewers across the country who contribute to that appropriateness and balance.  We need participation from the research community.  As you know, if you receive NHMRC funding, we expect you will contribute to the process overall by undertaking Peer Review as well, and we know a great many people step up to that extremely well.  Confidentiality and impartiality are crucial, and we will say a bit more about that, and finally, of course, quality and excellence, which is the goal of all of the research that we fund.

SLIDE 5: Disclosure of interests and confidentiality

So, just to say a bit more about impartiality and confidentiality, it's perfectly normal for Peer Reviewers to have interests that are relevant to NHMRC. So, when we're seeking to avoid conflicts of interest, we're not expecting that people will have no interests at all that are relevant to NHMRC funding.  It's just important to be aware of them and to declare them if they're relevant to the grants that you'll be reviewing. It isn't always obvious in advance that you're going to have a conflict of interest.  They can become apparent at any stage during Peer Review, and so we ask you ‑ and, in fact, it's a legal obligation ‑ that you disclose conflicts of interest as soon as they arise.  So, if you suddenly find when reviewing an application that, in fact, it involves something or a person with whom you have a perceived or actual conflict of interest, it is really important to get in touch with the secretariat to discuss that.


As for confidentiality, this is obviously crucial for everything we do.  It's a legal obligation for you as a Peer Reviewer under the Privacy Act because you're going to be seeing some personal information, as well as seeing confidential research plans.  It's enforced through the NHMRC Deed of Confidentiality, which you will have agreed to.  It is important to remember that this is an ongoing commitment.  It's forever.  There isn't some point in the future when it's OK to talk about the applications that you've reviewed.  It is a lifetime commitment and so you'll need to treat the applications that you see in confidence, not disclose anything about them to people who are outside of the process, and that's true now and forever more.

SLIDE 6: Peer review is important

So then, as we come on to what we're actually asking you to do, it's really important that you know that we think Peer Review is incredibly important.  We absolutely rely on you and the scores that you're going to provide us for the funding recommendations that we will then make to the Minister for Health and Aged Care.  We ask that the scores you provide are made in reference to the category descriptors against the assessment criteria for this scheme.  That's essential if the research that's funded through Ideas Grants is going to meet the goals of that scheme.

The other thing that I imagine you're all very aware of ‑ because I know many people are applicants to NHMRC grants schemes ‑ that applicants dedicate a huge amount of time and effort to preparing their applications.  We know it can be extremely difficult.  It's demanding  and a lot of tension can go into the preparation of applications, particularly at a time when funded rates are unfortunately very low.  So, Peer Reviewers have a responsibility to give each assessment due care and attention, and to think about the fact that there's a person or a team of people at the other end who've put the effort into their application no matter what gaps or faults you might find in it.

It is very important to provide constructive feedback and to think about the people who are going to be on the receiving end of that feedback so that it's going to be useful for them in the future and not only distressing if it turns out that they don't get funded.

We also know that the outcomes of funding through this scheme, as through all of our schemes, can have a very significant impact on an individual researcher's career and the support for their team, so we respect that as well and we ask you to think about that in undertaking fair Peer Review.

Finally, every review counts.  The fact that there should be four other people reviewing each application doesn't mean that your review isn't important.  Every single review counts and each person's scores of an application will have an effect on the outcome of the review.

SLIDE 7: Ideas Grants 2022 peer review: overview of process

Now, I just want to run through the process and then just talk in a bit more detail about some aspects of what's on this slide.  So first of all, applications were submitted back in May.  The office undertook eligibility checks.  We then went out to you to seek your conflicts of interest and suitability to review sets of grants, and again I just thank you so much for the very major effort that that is.  We know it's a lot of work.  It's a critical part of the process to try to get the best matching we can of Peer Reviewers to each application, but I will say a little bit more about that in a minute.

We also know that the process this year was a little bit different from last year.  The disclosure of conflicts of interest and suitability was based on fields of research ‑ the new ABS codes ‑ along with Peer Reviewer areas, and that did mean that sometimes people came back to us and said that there weren't applications there that were suitable for them, and so we then asked you to look at longer lists.  So, for those who did need to look at longer lists again, I just thank you for undertaking that very substantial effort.


In late May then, the assessments against the Indigenous Research Excellence Criteria were undertaken and I'll say a little bit more about that later.  Then, finally, applications have been allocated to you as Peer Reviewers and we've aimed to have about 20 applications per Peer Reviewer and for each application to have five Peer Reviewers, and now you'll be undertaking those assessments independently.

The scores that you provide us will be the basis for generating a ranked list of applications, determining how far we can go down that list with the funding available to produce the list that will be recommended to the Minister for funding.  And then in due course, when the Minister has approved, funding will be announced ‑ we hope under embargo, though that's always in the gift of the Minister ‑ and then ultimately a public announcement.  Excuse me.

SLIDE 8:  Application-centric matching of peer reviewers

So, I want to say a little bit more about the way we're selecting Peer Reviewers, and many people will be aware ‑ maybe you're all aware ‑ we've changed significantly how we've undertaken that process in the last few years to a process that we call application‑centric matching of Peer Reviewers in place of the traditional panel‑centric form of matching to generate Grant Review Panels, GRPs.

Now, the GRP method of matching means you have to match a large group of applications to a group of assessors who will form the panel.  There may be 15 members of a panel, and with the constraints of matching a whole set of applications to a set of people, it's very much harder to get a high‑quality match across all of the applications and it's also much harder to avoid conflicts of interest.  So, some years ago we started developing an approach which would seek the most appropriate five reviewers for each application, rather than trying to form panels, and we use this term “application‑centric matching”.  What that can mean is that every application has a unique set of Peer Reviewers.  It may not always happen but that is what is tending to happen.  And, from our point of view, what is really magic about this process is that it gives us so much greater flexibility to match the expertise of the Peer Reviewers to the content of the application and to avoid conflicts of interest.

So how can we assess that? Well, there are various ways we can assess how well the matching is going, and I'll just show one piece of data

SLIDE 9: Application-centric matching – response of reviewers

which is the response of Peer Reviewers to the question in our survey: in general, the applications assigned to me matched my area of expertise.  So back in 2019 for Ideas Grants, we had a panel‑matching process and just over half of people said that in general the applications assigned to them matched their area of expertise.  But we went to a form of application‑centric matching in 2020 and you can see the positive responses to that question jumped to almost 75% and, again, in 2021.  Now, I don't know what the data will be like for this year, but I think this is very encouraging, that we're getting much better matching from the point of view of Peer Reviewers as well as by the criteria that we can see internally.

SLIDE 10: Peer reviewing Ideas grant applications


So then just to go on to the actual process, now, I have said already a reviewer is assigned approximately 20 applications to assess.  That is what we aim for.  Some people might end up having a few more, because if other people drop out at the last minute, we may need to ask you to do a couple more.  We hope not to have to do that.  Sometimes it's necessary in order to get the adequate expertise.  And every now and then, somebody will receive significantly fewer than 20 applications because their expertise is relatively niche, but we're really aiming to have much the same number going to all Peer Reviewers.  And then that really gives you a way of benchmarking.  If you have too few, it's harder to get a sense of the spectrum of quality.  If you have too many, of course, we're putting a very great load on you, on top of what we know is a big load already.

So, what we’re then asking you to do as Peer Reviewers is to provide scores and comments against the four assessment criteria for the scheme, and as I’ve already said, it’s really important to look at those assessment criteria and understand what they’re trying to achieve.  We’re also asking you to review budgets and provide specific comments to support any recommendations for changes if you think that they are required.  And then finally for the first time this year, we’re going to be sharing comments between reviewers for a particular application, so if there are four other reviewers of a grant that you’re reviewing, then you’ll have a chance to see their comments and they will have a chance to see yours.

SLIDE 11: Viewing assessor comments (new in 2022)

So that’s something new for this year.  It’s a sharing of comments, not scores.  It won’t happen until after the assessments are closed.  It will, I hope, be very useful for people to understand how your comments have compared with others.  It’s not to say that your comments are better or worse than any others.  It will simply, I think, show a spectrum of views and probably be educational in understanding how other people have reacted to the same application.

It's really important to understand that this is not an opportunity then to go back and say, “Well, I want to rescore the application now that I’ve seen those comments”.  However, if you or we through that process identify that there’s been an error – for example, the comments don’t match the application – then absolutely we want to know that and we want to correct that error.  So, it's not for rescoring but it will be useful if any errors are identified, as can happen from time to time.  But really important to remember that when you're writing your comments, they're going to be seen by the applicants, who are your real audience, but they'll also be seen by the other Peer Reviewers as well.  We hope this is going to be a step in the right direction.  It adds to accountability, it adds to education and training, about Peer Review.  We'll be very interested in your feedback at the end of the Peer Review process on whether you found this useful.

SLIDE 12: Ideas grant assessment criteria


So, the assessment criteria for Ideas Grants are Research Quality, Innovation and Creativity, Significance, and Capability, and I'm just going to say a little bit about each of those now. 

SLIDE 13: Research Quality

So, first of all, research quality is asking you particularly to evaluate the quality of the hypothesis and the rationale in the research plan, the quality of the study design and the approach, but also how competitive you think this is with equivalent proposals internationally in your experience and also how well you think the applicants have outlined the risks that they see, scientific and technical risks in their project, and how they propose to manage them.

Now, it's a really important part of this scheme that there should be risk.  It's very hard to be innovative and creative without risk, and we really want this to be a scheme where people will take on risk in their proposals, but they should be aware of what those risks are, and they should be able to describe them and say how they'll manage them.  So, we're asking you to have a look at that.

The other thing about this scheme is it very specifically does not ask for preliminary results and pilot studies.  We're not looking for projects where everything has already been done.  We're really looking for projects that will push the boundaries, and that may mean that there's very limited preliminary data or pilot studies to support it.  So please remember that that should not be expected and would not be a reason to score down an application.

SLIDE 14: Innovation and Creativity

So, then the second criterion is innovation and creativity, and we know that this can be a difficult one to come to grips with, but the scheme is very specifically trying to challenge and shift current paradigms or have a major impact on an area in a creative way.  So, the research shouldn't just be incremental; it should really be pushing the boundaries and not just be obvious.  So, expect to see a point of difference from current concepts, approaches, methodologies, etcetera, and think about, relative to the research field of the project, to what extent the planned research demonstrates that the project aims are innovative, that the outcomes will shift the current paradigm if successful, and that it could lead to a breakthrough or real impact in the research area.  So, this is really asking you to draw on your own experience to think about what would be boundary‑pushing in the field.

SLIDE 15: Significance

The third criterion is significance, and this is the extent to which the outcomes and outputs will result in advancements to the research or health area.  Now, it's really important that this is not seen as ‘how big is the problem?’  If it's a project, for example, that's about diabetes, the fact that very many people suffer from diabetes does not automatically make that project significant.  What would make it significant is if the outcomes of the research would actually make a difference.  So, it's the issue of the importance to advance the area or the health problem, not the scale of the problem that's being addressed.

You're asking whether the planned research will result in outcomes in the science, the knowledge, the practice or the policy underpinning human health issues, and whether it will lead to research outputs of any kind, of which there are many listed there.  So, think broadly about significance but don't think of it just as being whether lots of people suffer from the disease which might be the subject of study.

SLIDE 16: Capability


The fourth and final criterion ‑ sorry, I skipped a slide there ‑ is capability, and this is really important.  This is not track record.  This is the capability of the applicant team to deliver the project.  It's not the same as whether they have a long list of publications or whether they've been invited to give plenary presentations at international meetings or all of those other sorts of things that can be a traditional part of track record assessment.  This is about the capability to deliver the project as described.

So, first of all, the CIA ‑ does that person demonstrate the capability to lead the team in order to achieve the aims? And, as we've said, we want this to be an opportunity for early and mid‑career researchers, provided that they've got some ability to demonstrate their capability to deliver the project ‑ not the same as a deep track record.  And then does the CI team as a whole have the capability to execute the project and deliver the outcomes? Do they have access to all the resources they need or the additional personnel, for example as Associate Investigators, that would enable them to deliver? And do they have that overall balance of expertise and experience and training across all the different aspects of the research in order to be able to deliver it?

SLIDE 17: Refer to Category Descriptors

So then remember to refer to the category descriptors ‑ I've said it a couple of times already ‑ but they are your guide and they're a description of what would be a best‑fit outcome.  They provide you with some benchmarks.  It is important to remember that a project doesn't have to meet every single descriptor at a given score, but those descriptors do give you an idea of where to rank your scoring of the application, and they're really a way then to standardise, as far as you can, across the 20 or so applications you're looking at, your distribution of scores against our scoring grid.

I know people argue about the difference between exceptional versus outstanding versus excellent, but think of them on a scale, look at the actual words that are in the descriptors and think about how you can use them consistently to describe each application that you're reviewing.

SLIDE 18: Indigenous Research Excellence Criteria

The Indigenous Research Excellence Criteria (IREC) I mentioned earlier.  This is specifically for applications that are identified as relating to Aboriginal and Torres Strait Islander Health and Research, and you will need to take into account the report on those criteria provided by an IREC assessor in their assessment of the application.  You may not receive any applications for which this is important, but if you do, it is really critical that you take that IREC report into account when you arrive at your scores.

There's more advice on this in Appendix F of the Peer Review Guidelines and the Peer Review Mentor Video also provides some advice on assessing these criteria, and you can find that on our website.  Excuse me.

SLIDE 19: Budget review (‘by exception only’)

Now, the budget review is by exception only ‑ it means that you don't need to comment on this if you think that the budget is fair and reasonable, but we do ask all reviewers to look at the budget and consider whether the requested items are necessary, justified and appropriate value for money.  The budget request needs to align with the Direct Research Costs guidelines, and if you, when you look at the budget, think some of it is not reasonable, then please provide us with very specific comments ‑ which section to reduce, by how much and why, so that in our internal review, we can then understand what the recommendation is and make a decision on whether to make a budget cut to the grant.  And that might mean that you think a PSP2 request is not justified, that the work described would be more than adequately done by a PSP1, in which case say that explicitly, that the PSP2 request is not justified, a PSP1 would be appropriate, and the budget should therefore be reduced in that way.  So be as explicit as that so that our team is not then second‑guessing what your intention is if you think that the budget is a little high.

But if you do think that the budget is reasonable and justified, then just leave that box blank, but please consider it before making the decision whether to leave it blank.

SLIDE 20: For noting


Now, to say a little bit about Cancer Australia: some of you may see that applications you are reviewing are also applying for Cancer Australia PdCCRS funding, and for those there will be an additional page that outlines the changes that would meet the requirements for Cancer Australia and would then require rescoring by the reviewer.  So, it's important to understand if you need to do that for any of your applications.

You might also find that there are applications that are only applying to Cancer Australia or a Cancer Council.  This shouldn't affect how you score it against the NHMRC Ideas Grants assessment criteria because that's the information that we're then providing to Cancer Australia and to Cancer Councils.

If you see any aspect of an application that you think should be looked at for eligibility, then please let NHMRC know as soon as possible, but just continue with the review in the normal way and don't let any concerns about eligibility affect your fair review of the application.  Assume that we will sort out the eligibility issues; you don't need to, as long as we know about them.

SLIDE 21: Assessor comments

Now, assessor comments are really critical.  We've already talked about how important they are because they'll be going to the applicants and they're designed to be useful for them.  They're also designed to explain why you've given the scores you have.  It's also a form of accountability so that everybody can see that ‑ everybody who looks ‑ that's the applicants and us ‑ will see that you have, in fact, read the application as, of course, everyone expects that you would.  So that feedback is not going to be filtered before it goes to the applicants.  It's going to go to them as it's written by you, and, as you know, it will also be seen by the other Peer Reviewers for the same application after scoring is complete.

We ask you to comment on the strengths and weaknesses of the application against each criterion, so there's 100 to 1,000 characters per criterion.  It's really important that those comments align with your scores and it's very helpful to everyone if you can be really specific and concise and think about using very clear language so that it's really helpful to the applicants when they receive it.  Of course, it's critical for everyone that your criticisms and comments are constructive and that the tone at all times is professional and objective and not personal, remembering that most applications will not be funded.  People will be deeply disappointed if they're not funded and the comments that they receive will be a very important piece of feedback which they will pore over to try to understand why their scores were as they were.

So, there's some more advice on preparing assessor comments in the Peer Reviewer support pack, so please have a look at those.

SLIDE 22: Tips for peer reviewers (1)


So just finally, a few tips: start early.  The time goes very, very quickly and if you've got about 20 applications to do over four weeks, we know it's a lot of work, so please don't leave it until the last minute.  Please read the Peer Review Guidelines so you really understand the processes and your responsibilities.  As I've said a couple of times already, think about the scheme's aims.  Look carefully at the assessment criteria and the category descriptors because they're there to guide you in the work and, I hope, to make it a lot easier to keep you on track with what we're actually looking for.  Then there are all sorts of instructions on how to navigate through Sapphire in the Peer Reviewer support pack.

SLIDE 23: Tips for peer reviewers (2)

Secondly, have a look at the Peer Review Mentor Video.  It's got some really nice tips from very experienced people.  Undertake the online Implicit Association Test and view The Royal Society video on understanding unconscious bias.  Even if you looked at that in the past, please have another look.  I think it's really important for us always to be reminded about our natural biases ‑ we all have them ‑ and to think about how we can address them.

We would like you to complete your assessments in Sapphire, and if you're not doing that for any reason, please let us know because the team here, the secretariat, will be monitoring whether people have started their reviews and they will be very, very worried if they see that no progress has been made in your reviews.  You may be working busily offline, but they won't know that, and the tension will be rising and they'll be getting in touch with you and hassling you with emails and phone calls if they don't see any progress.  So, if for any reason you do need to work offline, just please let them know.  But, if possible, do work online because that's the intent and that's the best path. And then please seek advice from the Peer Review Mentors when required.

SLIDE 24: Resources for peer reviewers

There are all sorts of resources available that are listed here: the DRC guidelines, the category descriptor rubric, the video resources and, of course, your secretariat, who are here to help and want to enable you to do a good and efficient job.  If you want to get in touch with a Peer Review mentor, do that through the secretariat and they will help you make that contact.

SLIDE 25: Peer Review Mentors

Then, finally, the Peer Review Mentors are senior researchers who are very experienced in NHMRC Peer Review.  They are available to assist you with the process and to answer generic questions ‑ not questions about the specific application ‑ the people or the science ‑ but broader questions about your Peer Review process, and they are there to help you.  We're very, very grateful to Yvette, Nick and Rosalie for being willing to help us with this.

Now I think we're going to introduce the Peer Review Mentors, the two with us today, and then we'll go on to Q&A.  So, thanks very much.  Back to you, Amanda. 


Amanda Lawrence:  Thank you very much, Anne.  I think we have Rosalie on the line with us.  I might just get Rosalie to come on camera.  There she is; wonderful.  Could I just ask Rosalie to share a little bit about your Peer Review experience and introduce yourself to the group as one of our Peer Review Mentors for this round.


Prof. Rosalie Viney:  Hi, everyone, and can I reiterate Anne and Amanda's thanks to you all for being involved in this process.  I'm Rosalie Viney. I'm a Health Economist at the University of Technology Sydney, but today I'm actually coming to you from Lutruwita land of the Palawa people in Tasmania.  Just to give you a little bit of my experience, I have been a Peer Reviewer for NHMRC, and more recently the MRFF, over many, many years.  I have also chaired Peer Review grant panels when they existed in that form, along with various other activities, and I've previously been a member of NHMRC's Research Committee.  So I've seen some of the process that we've been through and some of the improvements that Anne's just talked about in terms of the more grant‑centric approach to Peer Review, which I think is very positive and of which you're the beneficiaries.  So, thank you.

Amanda Lawrence:  Thanks very much, Rosalie.  And we also have Nicholas Talley on the line.  Nick, would you mind coming on camera.


Prof. Nicholas Talley:  Good afternoon, everyone.  I'm Nicholas Talley from the University of Newcastle.  I'm on the NHMRC Council.  I've also been a Reviewer, of course, I've chaired GRPs, when they existed, for some of these schemes, and I held an Investigator Grant from the NHMRC.  So, I'm very much looking forward to trying to address the questions and comments that you may have about this scheme, and thank you very much for all your efforts in reviewing for the NHMRC.  It's just so important.


Amanda Lawrence:  Thanks very much, Nick. 

So just to explain how our Peer Review Mentor process works. It is an opt‑in process, so if you find that you do need a bit of additional support and have a question you'd like to put to one of the Peer Review Mentors, please email your secretariat in the first instance.  We are very conscious of our Peer Review Mentors' workloads, so we will try to triage questions as best we can so that we don't overwhelm them.  If it's a process type of question that we can answer from our policy, we will try to do that in the first instance.  But if you find that that answer doesn't satisfy what you were looking for, please get back in touch with your secretariat and at that point we can put you in touch with one of our Peer Review Mentors.  Just to reiterate as well, we also have Yvette Roe, who wasn't able to make it to today's session.  Yvette features in some of our videos, including for the Investigator Grant scheme, and she has provided a lot of support and advice as a Peer Review Mentor over the years.  Yvette's expertise is particularly in assessing the Indigenous Research Excellence Criteria, so I'd like to point that out too.  Unfortunately, Yvette couldn't introduce herself today, but that is certainly her area of expertise.

So that is how the process works.  You don't have to use it, but we do want to have it there as a resource if you need a bit of extra support with the Peer Review process as you go along and want to seek advice from someone with a wealth of experience, who can share some of the things they've considered as part of their own Peer Review assessments over the years that might be helpful to you.

As Anne mentioned before as well, the Peer Review Mentors are not there to discuss specific elements of an application with you.  It is very much just to share some of their experience in terms of approach that might be beneficial to you as you undertake your review.  This is particularly so for those of you who are newer to Peer Review in Ideas Grants: we do find from feedback that it can take a couple of rounds to really find your feet in doing Peer Review in this scheme, so we do have that resource available to you, should you wish to take it up.  So please do get in contact with your secretariat.

A huge thank you to everyone for your time today. We very much appreciate your involvement in the webinar. Particular thanks to Rosalie and Nick for your wise advice, and thank you very much for providing that extra support throughout this process. And thank you to Anne for battling through with the voice.

Prof. Anne Kelso: Thanks, everyone. Thank you very much.

Should you require a copy of the PowerPoint presentation, please email your Ideas Grants secretariat or