Recently I finished another round of GCAT review and reflected on lessons learned during the meeting. Some of it overlaps with another blog post I wrote after attending GCAT for the first time, but it is good to refresh those ideas…
1. Significance plays an important role. Many methods and techniques can be conceptually novel or even fancy, but if the reviewers don’t believe the approach will work, answer important biomedical questions, or be used by many people in the community, they will not be excited supporters of the proposal.
2. Track record of the team is very important. If the PI is at the forefront of the technology, publishes good work in reputable journals, has previously developed methods that are widely used, or has early access to unique and large datasets, it gives reviewers confidence that the proposed work will likely have an impact.
3. In the biosketch “contribution to science” section, when discussing each contribution, highlight the main idea in a phrase first before describing it in a more detailed paragraph. In fact, anywhere in the main proposal, using bold, italics, color, or underlining to highlight the important points will help reviewers read and absorb the ideas.
4. In the specific aims page, use the first two paragraphs to establish the motivation and significance. Then, in each aim, explain the research strategy in as much detail as possible: the experiments, the idea behind the algorithm, the unique and clever ideas, etc. Most reviewers only have this page to refer to during the review, so make the aims description meaty enough for reviewers to form a concrete idea of what you are going to do.
5. In the Innovation section of the proposal, educate the reviewers on the specific technological or methodological innovation, unique system, or clever idea in the proposal.
6. To develop a computational proposal, the PI should know the state-of-the-art methods and clearly demonstrate the gap in existing methods and the motivation for the proposed method. Sometimes a computer science- or statistics-oriented proposal can completely lose the biology reviewers. Explain the intuition or the unique cleverness behind the method, and its input / output, so at least reviewers roughly understand what the method is trying to do. Finally, discuss the exact deliverables and the metrics for measuring success (e.g. how to evaluate whether a new parameter or new model makes the results better, systematic experimental validation, etc.).
7. Some computational grants include an experimental collaborator to do validation, but sometimes the validation is just a description of experiments that generate more data, without a clear idea of how they validate or help refine the computational method. Some experimental grants will generate a lot of data, with data analysis added only as an afterthought. Reviewers appreciate a solid and coherent combination of the computational and experimental plans.
8. Large-scale ($) comprehensive data generation projects without clear motivation, a good biological problem, and deeper mechanistic and functional follow-up plans don’t usually fare very well. Also, it is probably wise for new investigators to stick to a modular budget.
9. Positively and fully address the previous reviewers’ comments in a revision. Reviewers give credit to good efforts in addressing previous reviews, especially additional preliminary results and strengthened aims, which demonstrate the group is devoted to the project rather than signaling “we will work on this crazy idea if we get $”.
10. Pay attention to grantsmanship: have logical and coherent aims, discuss workarounds for potential pitfalls, include support letters to demonstrate significance or expertise or to cover weaknesses, use a clear and friendly text format, and check figure and equation legibility.