HeLa cells grown in culture and stained with antibody to tubulin (green), antibody to Ki-67 (red), and the blue DNA-binding dye DAPI. The tubulin antibody shows the distribution of microtubules, and Ki-67 is expressed in cells about to divide.

Guest post: Ancillary criteria can make or break a borderline NIH proposal

A lab mouse sits on the gloved hand of a researcher.
Consideration for the gender of animal or human research subjects is one of several ancillary criteria PIs should address in NIH proposals to maximize their scores.

The following guest post was written by Dr. Carl Batt. Batt is Liberty Hyde Bailey Professor in Cornell University’s Department of Food Science. In his 30-year career, he has served on a number of NIH study sections. Below, Batt shares his experience with ancillary criteria that may affect how NIH proposals are scored in the review process.

Sitting on a study section (review panel) at NIH helps researchers understand (as much as possible) the nuances of how panels work and what the collective wisdom of the panel might be. For those of you not familiar, the standard practice is to have three or more preliminary reviewers whose scores form the basis for the selection of proposals that will (or will not) be discussed at the review panel. There, the lead reviewer presents the case, followed by the secondary and tertiary reviewers. If the proposal is not scored at the review panel, there is no reporting of these preliminary scores.

The following comments address a number of ancillary criteria that reviewers are asked to comment on and that may (or may not) affect the preliminary scoring. Should the proposal move forward to the panel, these criteria are sometimes not discussed until after the reviewers have offered their final scoring. Every member of the panel (aside from those who have a conflict of interest) provides a score, which could be affected by the discussion of these criteria.

There is no doubt that, beyond the importance of the topic itself, the research plan is the most important element judged by the reviewers. A bad idea cannot be saved, especially if it is deemed not to be innovative. Over the past few years, a number of additional criteria have been introduced as the NIH (and other agencies) attempt to address various issues, some external and some internal. The following are things to keep in mind. While they will not sink a great proposal, they could tank a borderline one. They are not scored, but reviewers are asked to note compliance and challenges. In practice, that means reviewer concerns that, if noticed, can creep into the preliminary scoring (for the initial reviewers) and into the overall impact score if your proposal gets scored.

Ancillary criterion 1: Rigor

Significant attention is being directed toward the reproducibility of research, and this has crept into all science, not just that which involves qualitative, seemingly subjective experimental methods. Rigor is defined by the inclusion of controls, replicates, and other matters that are typically thought of as just “good science,” but that you might not describe in an explicit manner. The Center for Scientific Review (CSR) manager (and sometimes the academic panel manager) will often remind the reviewers to comment on “rigor,” and for the most part the answer is “affirmative,” meaning either the experimental plan explicitly addresses rigor or the reviewers simply didn’t notice a problem. On rare occasions, a reviewer might comment about the lack of rigor, and, as with everything else related to review, you are not in the room to defend your proposal. It is important that you explicitly mention controls and replicates to ensure that the matter of rigor is covered. In the past it might have come up in a reviewer’s commentary, but now it is an explicit part of the review process.

Ancillary criterion 2: Gender of animal or human subjects

If you propose to use animal or human subjects, be aware there is a specific question asked: “Has the applicant considered ‘sex’ as a potential variable in their animal studies?” At a minimum you should state the sex of the animal or human subjects you will be using. It’s appropriate to state that you will use both male and female subjects, but you should follow that statement with an additional statement that you will consider sex as a variable in your experimental plan.

Ancillary criterion 3: Milestones

Along with other agencies, the NIH has moved toward a more rigorous set of expected outcomes. Some funding opportunity announcements (FOAs) have explicit language requiring milestones, and it isn’t a bad idea to include them even in proposals for FOAs that don’t require them. The following is the language found in one NIH FOA.

Milestones. This sub-section is required for all applications. All applicants must describe here a set of discrete benchmarks that will allow unequivocal determination of the progress made towards the goals of the project. Milestones should be scientifically justified and well defined for each year of the project and be based on the proposed specific aims. Whenever feasible, milestones should provide quantitative benchmarks for comprehensively assessing the annual progress of the project. Milestones must not be simply a restatement of the specific aims. The specific aims describe the research goals of the project. Rather, the milestones should provide the means for assessing the progress made towards each aim and offer a timeline and a “pathway” for the testing of a discovery concept or development of a technology. The completion of these milestones will be used to judge the success of the proposed research on an individual-project basis.

Examples of Milestones:

  • Verify that the designed composite nanoparticles are able to reproducibly release an activated component at tumor/cancer cell sites in vivo.
  • Ascertain that a new targeted nanoparticle can specifically deliver a therapeutic agent to the tumor by demonstrating that agent concentration in tumor exceeds at least “x” times its blood concentration.
  • Demonstrate the ability of a nanoparticle diagnostic construct to detect at least “x” specific proteins in blood (out of “y” specific proteins proposed) at a femtomolar level.
  • Demonstrate the ability of the proposed nanotechnology to achieve 95% rate of capture for circulating tumor cells in blood.

Reviewers have various opinions about milestones, and most don’t understand exactly what they mean. But some do, and the simple advice is to have milestones that include quantitative outcomes that can be measured and promised with some degree of temporal specificity. In other words, instead of “I will build you a house,” think “I will complete your 3-bedroom, 2-bath, 2,000-square-foot home by April 1, 2019, for $100,000.”

Ancillary criterion 4: Resource availability

This request has to do with how you will make available some of the outputs of your research, including data but also physical resources such as strains, plasmids, or other materials. The simplest answer is to offer everything to everyone, but the practical reality might be different. The goal is to make sure there are no doubts about the availability of resources. The section should consider what you will make available, when, and under what conditions. It is certainly acceptable to include the prerequisite that a material transfer agreement (MTA) be in place.

Ancillary criterion 5: Resource validation/concern for authenticity

This section addresses how you will know that the reagents (animals, cell lines, chemicals) you are working with are what you think they are. This stems from reports that cell lines, for example, are frequently not what people say (or believe) they are. “I will buy stuff from established vendors” is an answer, but sometimes reviewers object to that statement and will make note of it (and sometimes that might affect their initial scoring). Even established chemical vendors (e.g., Sigma) are the subject of some derision. There is little to be done, as nobody can afford to send every chemical out for independent analysis. Certainly a statement about the certificate of identity might be worth noting. In the case of cell lines, stating which markers you will assay (especially where there are specific mutations or knock-outs) is a way to address this section.

Written by Carl Batt, Liberty Hyde Bailey Professor in Cornell University’s Department of Food Science.

HeLa cell image by EnCor Biotechnology via Wikimedia Commons. Mouse photo by Rama via Wikimedia Commons. Both used under Creative Commons license.

A carryout food box containing a chicken quarter, fried plantains, a bowl of black beans and rice, and three soufflé-cup containers of sauce (two orange, one green).

Smorgasbord of takeaways offered at NSF Spring Grants Conference

A trailer that has been outfitted as a take-away restaurant. There is a sign reading "JJ's Take Aways" on the awning over a small seating area directly in front of the trailer. The posted menu reads as follows: Hake Chips. Hake Combo - R60. Snack & Chips. Fish Burger R14. Cheese Burger with Egg - R35. Calamarie [sic] burger - R28. Kerrie Afval Met Rys - R26-50 (Tripe). Roomys (Scoopy).

I attended the 2017 NSF Spring Grants Conference held in Louisville in June. Below are ten takeaways that I feel are worth sharing with the Miami University research community.

TAKEAWAY 1: Talk to your program officer.

“Ask early, ask often.” The NSF staff repeated this phrase like a mantra, over and over throughout the two-day conference. Contacting a program officer (PO) is something we in OARS almost always encourage investigators to do, but many seem reluctant — they don’t want to “bother” the PO. The message at the conference was that POs do not consider such contact a bother. In fact, it helps them do their jobs better. Any time they can re-direct an investigator to a more appropriate directorate or a program that’s a better fit, they’re saving themselves and their reviewers — and the investigator! — unnecessary time and effort. Any time they can coach an investigator to a stronger proposal, they’re improving the proposal pool and increasing the likelihood their program’s investments will advance the frontiers of science. That makes them look good to their bosses (and it makes their bosses look good to members of Congress, who control NSF’s future funding!).

TAKEAWAY 2: Want to write more competitive proposals? Volunteer as a reviewer.

There’s no better way to learn about what reviewers are looking for than to participate in a panel. The best time to volunteer is right after the submission deadline for a program for which your expertise is relevant. Program officers will be ready to put together the review panels for those proposals at that time, so you will be meeting a pressing need.

Learn more about becoming a reviewer here.

TAKEAWAY 3: The deadline is the deadline.

The cut-off time for submissions to most NSF programs is 5:00pm, local time for the institution. (That means that attending a conference on the West Coast on the due date does not extend a Miami researcher’s deadline to 8:00pm ET!) FastLane is configured not to accept any additional submissions once the clock ticks over from 4:59:59pm to 5:00:00pm.
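As a hedged illustration (the dates, times, and time zones here are hypothetical), timezone-aware datetimes make the “local time for the institution” arithmetic explicit:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Hypothetical example: a Miami (Ohio) researcher's 5:00pm Eastern deadline,
# viewed from the West Coast. Submitting at 4:30pm Pacific is already
# 7:30pm Eastern -- well past the institution's deadline.
deadline_et = datetime(2017, 7, 26, 17, 0, tzinfo=ZoneInfo("America/New_York"))
attempt_pt = datetime(2017, 7, 26, 16, 30, tzinfo=ZoneInfo("America/Los_Angeles"))

# Aware datetimes compare on the same absolute timeline, regardless of zone.
on_time = attempt_pt < deadline_et
print(on_time)  # False
```

The comparison works because Python normalizes both aware datetimes to UTC before comparing, so no manual offset arithmetic is needed.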

There are lots of reasons not to wait until the last minute to submit, including:

  • Lots of other people will be doing the same thing and FastLane might get bogged down. If you’re submitting late enough and the system gets bogged down enough, your submission might not go through before the clock ticks over to 5:00:00pm.
  • Even though the NSF will consider granting deadline extensions to institutions that have been affected by certain natural or anthropogenic events, there are a lot of things that could go wrong that wouldn’t fit the NSF’s criteria. For instance, one NSF staffer said, an internet outage — even one that’s campus-wide — will not earn you a deadline extension. If you’ve waited until the last minute to submit and the internet is down, you’ve missed your opportunity.
  • If you spot an error in your submission materials, it will be too late to fix it. Once 4:59:59pm has passed, you cannot withdraw your application and re-submit. On the other hand, if you submit a day or two early, you’ll have plenty of time to withdraw and re-submit, not only in the event that you spot an error, but also in the event the program officer spots an error and alerts you to it (this does happen!). Bonus takeaway: Immediately after submission, print out from FastLane what you actually submitted (not the file from your computer you think you submitted). This lets you know right away if you accidentally submitted a file containing an outdated draft or if — as an NSF staffer said happened once — all the pages are solid orange with no visible text.

TAKEAWAY 4:  Making a connection to one of NSF’s “10 Big Ideas” could give your proposal a competitive edge.

Program officers are charged with connecting the awards they make to the “10 Big Ideas” NSF Director France Córdova laid out in testimony to Congress earlier this year. If your proposal articulates an explicit connection to one of the “10 Big Ideas” (specifically one of the six research ideas), you’ll be making the PO’s job easier. In an environment where there are insufficient funds to support all of the meritorious projects proposed, that might make the difference between being offered an award and not.

TAKEAWAY 5:  Make your “moonshot” idea a second aim.

When resources are tight, those who control the purse strings often have little appetite for risk-taking. Taking a chance on something that’s relatively likely to fail — never mind the implicit consequences of success — seems potentially wasteful and opens decision-makers up to being second-guessed. It’s much safer to bet on the anodyne project highly likely to result in incremental progress. For that reason, Jennifer Weller, a Program Director in the Division of Biological Infrastructure (DBI), suggests those seeking support for a “moonshot” idea make that idea a second aim in a proposal in which the first aim is more conservative. This approach virtually guarantees reviewers and POs a return on investment, while also opening the door to the possibility of transformational change.

TAKEAWAY 6: Broader Impacts are the NSF’s hedge against the risk inherent to science.

When it comes to Broader Impacts, one of the NSF’s two review criteria (along with Intellectual Merit), Weller suggested it might be useful to investigators and proposal writers to think again in terms of risk. There’s always a chance, however small, that the science might not “work out” in a funded project. Broader Impacts — benefits to society that derive from the project — are a parry against that possibility. They’re what Congress and U.S. taxpayers get for their money even when the science does not advance.

Bonus takeaway for applicants to DBI: Weller reported that Broader Impacts are assigned a weight of about 30% in the review of proposals submitted to DBI (with the exception of CAREER proposals, where Broader Impacts are weighted at 50%). Note that this relative weighting varies from division to division and directorate to directorate.

TAKEAWAY 7:  Cost sharing is bad, and investigators don’t always recognize it. But NSF does. (And so does OARS!)

To keep all applicants on a level playing field, the NSF prohibits any institution from volunteering to assume responsibility for any of the costs related to a project. (Some programs, like the Major Research Instrumentation or MRI program, are exceptions and actually require a specified percentage of cost share from the institution.) The problem is that not all investigators realize what constitutes cost sharing. The rule is that any cost-generating item or activity mentioned in the project narrative must be associated with a reasonable cost in the budget. Any cost-generating item or activity that is not included in the budget, or for which the budget indicates that the institution will bear the cost or any portion of it (including via a reduced indirect cost rate), is cost share. (Note that this does not include resources discussed in the Facilities, Equipment, and Other Resources section of the narrative. Costs for items and activities discussed in this section are assumed to be included in the institution’s indirect cost rate.)
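One way to think about the rule above is as a cross-check between the narrative and the budget. This is only an illustrative sketch with made-up item names, not an NSF tool: any cost-generating item mentioned in the narrative but absent from the budget is a red flag for unintended cost sharing.

```python
# Hypothetical example: cost-generating items mentioned in the project
# narrative versus items actually costed in the budget.
narrative_items = {
    "graduate student stipend",
    "mass spectrometer time",
    "travel to field site",
}
budget_items = {
    "graduate student stipend",
    "travel to field site",
}

# Anything in the narrative but not in the budget could be read as
# the institution volunteering to absorb the cost, i.e., cost sharing.
possible_cost_share = narrative_items - budget_items
print(sorted(possible_cost_share))  # ['mass spectrometer time']
```

In practice this check is done by a careful human read of both documents; the set difference just makes the logic of the rule concrete.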

TAKEAWAY 8:  “Participant support” might not mean what you think it means.

In an NSF budget, participant support costs are (only) those costs related to travel and subsistence for participants in conferences and training activities. Entertainment is not an allowable expense. Speakers and faculty leaders of conferences and training activities are not participants. A student can be a participant or an employee on a project, but not both. If your proposal does not include conducting a conference or training activities, there should be no participant support costs included in your budget. Incentives offered to human subjects who participate in studies are not participant support costs.

TAKEAWAY 9:  When it comes to expenses, document, document, document.

NSF’s Office of the Inspector General regularly performs audits of institutions and projects that receive grant funds. To avoid negative findings in the event that your grant is audited, make sure to carefully document all expenses charged to the grant and any relevant special circumstances. The look-back period for audits is three years, so if there are details — reasoning, conversations, special circumstances — related to expenses you think you won’t remember in three years’ time, make a note of them when they occur and keep that note with your records for the project.

TAKEAWAY 10:  Remember to submit your reports.

There are very real consequences to failing to submit required reports. For one thing, if a report is overdue, all administrative actions — including change in scope and change in project personnel — are blocked for the project. More importantly, PIs and co-PIs on projects with overdue reports are blocked from receiving further funding from NSF. One program officer related an anecdote about having extra money available to fund a project on his “wait list” after an award he’d made had been declined. He was especially enthusiastic about the proposed project and was looking forward to making the $1 million award. That all ended when he looked the PI up in the system and discovered a report from another of the PI’s projects was overdue, meaning that the PI was not eligible to receive additional funds.

If you have any questions about these ten takeaways — or anything else about NSF grants — ask them in the comments or contact your OARS representative.

Written by Heather Beattey Johnston, Associate Director of Research Communications, Office for the Advancement of Research and Scholarship, Miami University.

Takeaway food photo by Dan Reed via Flickr. JJ’s Take Aways photo by SA-Venues.com via Flickr. Both used under Creative Commons license.

Two researchers in lab coats look through separate lenses on the same microscope.

Former NIH staffer reveals grant review process

A stylized representation of two people, one of whom has a speech bubble with the word YES! and the other of whom has a speech bubble that says NO?


“No amount of grantsmanship will turn a bad idea into a fundable one . . . but there are many outstanding ideas that are camouflaged by poor grantsmanship.”

-William Raub, Past Deputy Director of the NIH

The mission of the National Institutes of Health (NIH) is to seek fundamental knowledge that will enhance health, lengthen lifespan, and reduce illness and disability. Yearly, Congress provides funding so that NIH can help meet this mission and its goals of supporting creative discoveries and innovative research. While innovation and significance are core to the NIH review process, researchers must instill a high degree of confidence that they have the training, experience, methodology, and supportive institutional environment to be successful in their research endeavors.

On Thursday, November 3, Dr. Norman Braveman, Miami alumnus, former senior member of the NIH extramural program, and current President of Braveman BioMed Consultants, shared with faculty in Miami University’s Department of Psychology the importance of understanding the NIH review process for successful grantsmanship. Not only should a grant proposer understand the review process, they should also understand the mission of the NIH and, most importantly, of the institute or center (IC) to which they are applying. Braveman suggested going to the IC website to get a snapshot of the IC strategic plan prior to formulating project objectives, so as to focus your research on areas of current importance to the mission of the IC.

Braveman began by providing a 50,000-foot overview of the NIH grant review process. Most applications are assigned to the Center for Scientific Review (CSR), where they undergo initial scientific merit review. Some applications may undergo initial peer review in a specific IC, as specified in the funding opportunity announcement (FOA). Understanding where your proposal will be reviewed (CSR or IC) and by whom (e.g., scientific review group) can help you target your application accordingly.

A flow chart describing the NIH review process. The chart starts with Researcher idea and Institution. A solid arrow then leads to Grant Application (R01, R03, R21, K01, K08, etc.). A solid line leads to NIH/CSR Referral and Review. From there, there are two solid arrows. The first leads to IC/Program, which is the end of that chain. The second arrow leads to Initial Peer Review CSR or IC. A solid arrow leads from Initial Peer Review CSR or IC to Review Summary Statement. There are two solid arrows leading from Review Summary Statement. The first leads to PO/Applicant, and there is a dotted line that leads from there to Secondary Review National Advisory Council. The second solid arrow leads from Review Summary Statement to IC Decision Process. From IC Decision Process, there are two options: 1 - funded (represented by a handful of cash) and 2 - unfunded (represented by a thumbs-down). From unfunded, a solid arrow labeled revision leads back to the start - Researcher idea and institution.
This flow chart from Dr. Braveman’s presentation shows how the NIH review process works.

While NIH review criteria include significance, investigator(s), innovation, approach, and environment, Braveman suggested that the un-scored “Specific Aims” section is the most important part of any NIH grant. It is in this section that researchers tell the reviewers about the purpose and importance of their project. The specific aims may determine whether a reviewer wants to read more or set the proposal aside — that is, whether the proposal is triaged (not discussed or scored).

If the specific aims are clear, concise, and attention-grabbing, then the proposal may be discussed by the study section. It is important to note that any proposal that has been triaged can be pulled into the discussion by any of the reviewers serving on the study section. Proposals that are discussed receive a score from 1 to 9, with 1 being “exceptional” and 9 being “poor.” Average scores are multiplied by 10, then scores from all proposals from that review round are percentile-ranked.
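The arithmetic described above is simple enough to sketch. This is a minimal illustration of the average-times-ten step only (the panel scores are made up, and percentile ranking across the review round is not shown):

```python
def impact_score(panel_scores):
    """Average the panel's 1-9 scores and multiply by 10.

    Each panel member scores 1 (exceptional) to 9 (poor); the
    overall impact score therefore runs from 10 to 90.
    """
    if not panel_scores or not all(1 <= s <= 9 for s in panel_scores):
        raise ValueError("each score must be between 1 and 9")
    return round(10 * sum(panel_scores) / len(panel_scores))

# Hypothetical five-member panel leaning "outstanding"/"excellent":
print(impact_score([2, 3, 2, 3, 2]))  # 24
```

A lower impact score is better, and funding decisions then depend on where that score falls in the percentile ranking for the round, not on the raw number alone.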

Table describing the NIH review scoring system. The first column is Score, the second Descriptor and the third Additional Guidance on Strengths/Weaknesses. Data in the rows are as follows: 1/Exceptional/Exceptionally strong with essentially no weaknesses. 2/Outstanding/Extremely strong with negligible weaknesses. 3/Excellent/Very strong with only some minor weaknesses. 4/Very Good/Strong but with numerous minor weaknesses. 5/Good/Strong but with at least one moderate weakness. 6/Satisfactory/Some strengths but also some moderate weaknesses. 7/Fair/Some strengths but with at least one major weakness. 8/Marginal/A few strengths and numerous major weaknesses. 9/Poor/Very few strengths and numerous major weaknesses.
This table from Dr. Braveman’s presentation describes the NIH review scoring system.

Once proposals are scored and ranked, they go to the National Advisory Council for second-level review. This council reviews the nature of the proposals submitted to see how they fit into the objectives of the NIH and of the relevant IC. Considerations such as how project ideas may work synergistically to answer larger questions are taken into account in making funding decisions. Therefore, a proposal that doesn’t score as well as another may still get funded because it addresses a question that no other proposal addresses and that is currently important to the mission of the IC.

Another important component in the NIH decision making process is the program officer (PO). The PO interacts with the National Advisory Council and should be able to speak on behalf of any given proposal. It is important to contact the PO early in the proposal development phase to ensure:

  • Your proposal objectives match the current program or IC objectives
  • The PO understands your objectives in case he or she needs to advocate to the National Advisory Council

In addition to establishing a working relationship with the PO, other things to keep in mind when writing the NIH proposal include:

  • NIH and IC objectives
  • Reviewers read 20-25 applications three times a year, so yours needs to be meaningful and well-written
  • You should write to the NIH review criteria: significance, investigator(s), innovation, approach, and environment
  • Use the biosketch to help establish capabilities of the investigator(s)
  • Use the “Resources” section to elaborate on the appropriateness of the environment

Remember that when writing an NIH — or any — grant proposal, as Francis Bacon said, “knowledge is power.” Be certain to do your homework and understand the funding agency, as well as the review criteria and review process, to increase your chances of funding success.

Written by Tricia Callahan, Director, Office for the Advancement of Research and Scholarship, Miami University.

Microscope image by the National Cancer Institute, via Wikimedia Commons, used under public domain. YES!/NO? illustration by I and Materia via Wikimedia Commons, used under Creative Commons license. Flow chart and table illustrations by Dr. Norm Braveman, used with permission.

Closeup of a keyboard. The following keys are visible: 5, 6, 7, 8, 9, T, Y, U, I, H, J, K, N, M.

Proposal development workshop to be led by former NIH staffer

Index card with the following text handwritten: Imagining writing amazingly. I front-load important info so that people can decide if it's worth their time. I untangle thoughts and make them easy to follow. I make it easy for people to read at the level of detail they want. I save people time by drawing on multiple sources and my experiences, while letting them go back and explore the context. I share practical everyday things that people can apply. I don't fake authority. I am honest and approachable. Sometimes I'm mistaken, and that's okay. I resonate with people. I draw out what other people know so that we can share it with even more people. Everything is archived, searchable, and maybe even organized. A doodle illustrates each "I" statement.


Workshop: Beyond the Basics with Dr. Norm Braveman — May 2-3

Dr. Norman (Norm) Braveman, graduate of Miami University, former member of the senior NIH staff, and founder and President of Braveman BioMed Consultants, will be on campus Monday, May 2 and Tuesday, May 3 to share a “Beyond the Basics” workshop.

During his 30-year career at the NIH, Dr. Braveman made significant contributions to the extramural community. His career in the NIH extramural research program spanned peer review of investigator-initiated clinical trials with the National Heart, Lung, and Blood Institute; extramural science program development with the National Institute on Aging and the National Institute of Dental and Craniofacial Research; and agency-wide science program planning and evaluation in the NIH Office of the Director.

In the workshop, Dr. Braveman will offer an approach to maximizing funding potential while incorporating grant writing into an academic career.

Miami University researchers may register for any or all of the following sessions being held on Monday, May 2 by clicking on the session title. Registrations will be accepted until noon on Friday, April 29. All sessions take place in 257 Garland Hall.

  • Grant Writing & Your Academic Career (8:30-10:00am) – Geared toward graduate students, post docs, and junior faculty, this 90-minute session will focus on integrating grant writing into an academic career of teaching, research, and scholarship.
  • Know Your Benefactor (10:30am-12:00pm) – Successful grant applications are not only well written; they are also targeted to an appropriate funding source and should be written with the goals and objectives of the funder in mind. The focus of this session is on how agencies decide future direction and funding priorities. While the focus will be on the NIH application, the information presented is relevant to many funding agencies. Intended for graduate students, post docs, faculty, and proposal-writing staff.
  • The NIH Review Process (1:30-2:30pm) – Dr. Braveman will elucidate the peer review process at the NIH, including how proposals are reviewed and scored, as well as what reviewers look for in assessing the scientific merit of a proposal. Participants will learn about the review process at the NIH and can ask specific questions of Dr. Braveman about his experience as a former NIH staffer. Intended for graduate students, post docs, faculty, and proposal-writing staff.
  • Maximizing Your Grant Success: A Strategic Approach to Grant Writing (3:00-5:00pm) – This session will focus on writing a reviewer-friendly application. It will include grant-writing basics and tips on the DOs and DON’Ts of grant writing. Intended for graduate students, post docs, faculty, and proposal-writing staff.

Those who will be attending all sessions being held Monday, May 2 may also register for lunch.

On Tuesday, May 3, Dr. Braveman will offer one-on-one meetings with interested Miami researchers. During their meeting, each researcher will have the opportunity to ask Dr. Braveman questions about NIH and other grant-related matters. Registrations will be accepted until noon on Friday, April 29. All meetings will take place in 257 Garland Hall. Those who are working on a current NIH proposal may submit a draft to Anne Schauer by 5:00pm on Friday, April 15 to receive feedback from Dr. Braveman during their meeting. To register, click on the desired time slot:

Keyboard image by Henry Schimke via Flickr. Writing amazingly image by Sacha Chua via Flickr. Both used under Creative Commons license.

James H. Shannon Building (Building One), NIH campus

Questions about NIH proposal evaluation answered

Three people sit behind a table. The one in the middle holds up a card that says, "4. I like it."

While the process for proposal evaluation at the NIH is transparent and outlined on the NIH website, the steps and expectations can be overwhelming for those unfamiliar with NIH or its processes. A recent NIH webinar demystified the process as outlined below:

What happens to my application after it is submitted to NIH?
A majority of applications submitted to the NIH are assigned to the Center for Scientific Review (CSR). The CSR checks each application for completeness and assigns applications to a specific NIH Institute or Center (I/C). Applications are then assigned to a Scientific Review Group (SRG) or review committee that will evaluate the proposal based on NIH review criteria. Read more about application receipt and referral here.

On what criteria is my application evaluated?
Proposals are scored based on 5 core review criteria:

Significance– When ascertaining a proposal’s significance, reviewers ask questions such as, “Will the proposed work have a sustained and powerful impact on the field?” “Should this work be done and why?” and “Does the project address an important problem or barrier to progress within a certain field?” and then assign a score based on how well the proposal answers these questions.

Investigator(s)– What abilities, qualifications, and training do the investigators have to conduct the proposed work? There is an expectation that investigators demonstrate a record of success as evidenced by publications and prior funding. Some leeway is given for new and early stage investigators: New Investigators are those who have never received an NIH R01 research grant, while Early Stage Investigators (ESIs) are New Investigators who completed their terminal degree within the last 10 years. NIH is committed to accelerating the transition from new investigator to independent researcher, so New Investigators and ESIs do not have to demonstrate the same record of prior success in order to receive NIH funding. If applicable, be sure your NIH eRA Commons profile is up to date so it reflects your status as a New Investigator or ESI!

Innovation– The criterion of innovation addresses how well an application challenges or seeks to shift current research or procedures. Reviewers look to see if concepts, approaches, and methodologies are novel.

Approach– Is the proposed work appropriate in scope? Is it realistic? Are pitfalls and limitations anticipated, and if so, is an alternate plan laid out to address potential setbacks? It’s here that a sound evaluation plan becomes important to ensure the work is moving along in the proposed direction.

Environment– Are there adequate resources and institutional support for carrying out the proposed work?

Additional considerations include protections for human subjects; inclusion of women, minorities, and children; appropriate use of vertebrate animals; and management of biohazards.

What else do reviewers look for?
While all proposals are evaluated and scored on the five NIH core criteria, reviewers also look for: clear objectives with an obvious impact on the field; exciting ideas; realistic aims and timelines; brevity on obvious things; noted limitations; and a clear, well-written application that is free of grammatical errors.

How is my application scored?
Applications are scored on a scale from 1-9 as follows:

1- Exceptional
2- Outstanding
3- Excellent
4- Very Good
5- Good
6- Satisfactory
7- Fair
8- Marginal
9- Poor

After review, the scores of individual reviewers are averaged and that average is multiplied by 10 to give impact scores ranging between 10 (high impact) and 90 (low impact). Note: Only applications that are discussed at the panel receive impact scores; reviewers may decline to discuss an application they believe is not meritorious enough to warrant discussion.
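The arithmetic described above can be sketched in a few lines. This is an illustrative example, not official NIH code; the `impact_score` function name and the rounding to a whole number are assumptions for the sketch.

```python
def impact_score(reviewer_scores):
    """Illustrative NIH-style overall impact score.

    Each reviewer assigns a score from 1 (Exceptional) to 9 (Poor).
    The mean of all reviewers' scores is multiplied by 10, yielding
    an impact score between 10 (high impact) and 90 (low impact).
    Rounding to a whole number is assumed here for illustration.
    """
    if not reviewer_scores:
        raise ValueError("at least one reviewer score is required")
    if any(s < 1 or s > 9 for s in reviewer_scores):
        raise ValueError("scores must be between 1 and 9")
    return round(sum(reviewer_scores) / len(reviewer_scores) * 10)

# Example: a panel of five reviewers scoring 2, 3, 2, 3, 2
# mean = 2.4, so the impact score is 24
print(impact_score([2, 3, 2, 3, 2]))
```

Note that a proposal scored 1 by every reviewer earns the best possible impact score of 10, while unanimous 9s yield 90.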

Where can I track my application status?
Grant status can be tracked via eRA Commons. eRA Commons contains the following: a PDF file of the submitted application; contact information for the Program Officer (PO), Scientific Review Officer (SRO), and Grants Management Specialist (GMS); council meeting dates; scientific review group; study roster; status history (including dates); funding outcome; summary statement; and award number, if applicable.

Who do I contact for assistance?

Before you submit your proposal: The Program Officer (PO) is responsible for the programmatic, scientific, and technical aspects of a grant. If you have questions about the relevance of your work to the program, questions about the program not addressed in the announcement, or questions regarding the most appropriate study section for your application, contact your PO.

After you submit and prior to review: The Scientific Review Officer (SRO) is responsible for the scientific and technical review of proposals. The SRO is the point of contact for applicants during the review process.

After review (if funded): The Grants Management Specialist (GMS) is responsible for the business management requirements of the award. You may also need to contact your Program Officer if you need to request changes to your personnel, budget, or scope of work after an award has been issued.

After review (if not funded): After you’ve read your summary statement, you may want to talk to your PO about revising and resubmitting your application.

What is the typical timeline between submission and award?
For most applications, it takes about 9-10 months from proposal submission to receiving a Notice of Award (NOA).

If I am not funded, will I receive feedback regarding my application?
The NIH summary statement contains scores for each of the five review criteria, critiques from assigned reviewers, and a summary discussion of the overall review. For those proposals receiving an impact score, the summary statement will also show the overall impact score and percentile ranking. For those proposals not discussed, no overall impact score is given. Summary statements may also contain recommendations of the study section, a recommended budget, and additional administrative notes.

How can I learn more about the NIH Review Process?
Videos and information on the NIH review process are available on the NIH website. 
Additionally, depending on your accomplishments and expertise in a given area, you can become an NIH reviewer. Becoming a reviewer gives you valuable experience, as well as an insider’s perspective on the NIH review process. Learn more about becoming an NIH reviewer here.

Written by Tricia Callahan, Director of Proposal Development, Office for the Advancement of Research & Scholarship, Miami University.

Photo of James H. Shannon Building (Building One), NIH campus by Lydia Polimeni, National Institutes of Health, via Flickr. Score photo by uncoolbob via Flickr. Both used under Creative Commons license.