
Guest Post: True Tales of a Department of Education Grant Reviewer

In “Why Winning an Olympic Gold Medal is Not Like Getting a Carol M. White Physical Education Program (PEP) Grant,” Isaac wrote: “Many grant applicants are under the delusion from years of watching the Olympics and similar sports competitions that, if their application receives the highest review score, the grant will automatically be awarded.”

One of our faithful readers wrote in with this tale of grant reviewer woe, which has been anonymized to protect the innocent:

I was a Federal Reviewer for [a Department of Education program] a few years ago. Our team of three reviewers met via phone conference to discuss the grants after we had each read and scored them independently. We gave each grant pretty much the same score—within 5–10 points of each other—which surprised me. I remember the one we gave the lowest score to. It was awful. The project didn’t even ‘hit’ on the required elements of the RFP, and what they proposed to implement didn’t fit at all with [the subject of the program].

None of us had given them a score over 50. The moderator still asked us to discuss the score before we moved on, so we did; none of us wanted to change anything. When the list came out, two of the proposals we had reviewed were funded: one that had scored in the high 90s, and the one with the lowest score. I can’t remember the location of the low-scoring one, although it was somewhere out East.

I do know that the moderator told us the competition was pretty tough (she had been a moderator in the past) and that unless an applicant scored close to 100, they probably wouldn’t get funded. The moderator never mentioned anything about geographic distribution in any of our discussions, but it was in the RFP—I had gone back to check.

The review process is pretty anonymous and cut-and-dried. It’s done through the e-portal at e-grants, and there’s really no way to go back and talk to the moderator or the other reviewers once the decisions are made. I do remember being hacked off about that one, though. So much so that I didn’t apply to be a reviewer for them again the next year.

(Compare this chilling real-life story to the one about how RFPs get written in “Inside the Sausage Factory and how the RFP Process leads to Confused Grant Writers.”)

This story also demonstrates why we don’t read reviewer comments on clients’ previous proposal submissions—or our own. Although the three reviewers on this program mostly agreed with each other, there’s no guarantee that next year’s reviewers will weigh the same criteria or worry about the same issues. And even if they do, other considerations, such as geographic distribution, often outweigh what the reviewers want.

Still, you should strive to produce the best proposal you can: note that one funded proposal in our reader’s story did score very highly. You’re always better off with a clear, concise, well-written proposal than with an incomplete, poorly written one that depends on improbable help from reviewers or decision makers, which might come through for others but may not come through for you. You don’t know who else is applying, what their proposals look like, or what unstated factors the funding agency is really considering.
