Enhancing Scientific Review Processes in Research: Insights from the National Institute of Justice
Phyllis Newton, Office Director at the National Institute of Justice, presents strategies for stronger, more transparent, more diverse, and more efficient scientific reviews. The presentation covers annual review challenges, a pilot project, panel models, merit review processes, and funding decisions.
Presentation Transcript
Presentation by Phyllis Newton, Office Director, Office of Research and Evaluations, National Institute of Justice
Goals:
- Stronger Science
- Greater Transparency
- Greater Consistency
- Increased Diversity
- High-Quality Feedback
- Cost-Effective and Cost-Efficient
- Safeguards against Bias and COIs
- Effective, Timely, Concise, Consistent Input
Annual Reviews
- Changes across review cycles
- Identity of reviewers not disclosed
- Mixed panels of researchers and practitioners
- Substantial labor on:
  - Selecting complete panels
  - Contacting, confirming, scheduling
Pilot project: 5 scientific review panels in 2012
- Testing a single model for review panels
- Drawing on review processes at NIH and NSF
- Final design currently being finalized
- Process open to input
Process overview: Solicitation → Applicant → NIJ Receipt → Scientific Review Panel → Programmatic Review → NIJ Director → Funding Decision
Standing Panel model
- Three-year rolling membership
- 15 researchers and 3 practitioners
- Possibility for additional ad hoc reviewers
Initial Merit Review → Panel Review → Documentation → Recommendations to NIJ
- Each application reviewed twice, by two scientific members of the panel
- Reviewers provide written comments for each proposal
- Score individual elements defined in the solicitation
- Provide an overall score
- Scores above the median subject to secondary review, with exceptions (a sketch of the cutoff follows below)
- Abstract and written comments go to all members of the scientific review panel
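To make the median rule concrete, here is a minimal Python sketch of how an above-the-median cutoff for secondary review might be computed. It is not drawn from NIJ materials: the function name flag_for_secondary_review, the sample scores, and the reading of "above the median" as better than the median (numerically lower, since 1.0 is the best score on the 1.0–5.0 scale) are assumptions for illustration, and the exceptions noted in the slide are not modeled.

```python
from statistics import median

def flag_for_secondary_review(overall_scores):
    """Illustrative only: flag proposals whose overall score beats the panel median.

    Assumes the 1.0 (best) to 5.0 (worst) scale shown in the score-range table,
    so "above the median" is read as numerically at or below the median score.
    The exceptions mentioned in the slides are not modeled.
    """
    cutoff = median(overall_scores.values())
    return {app_id: score <= cutoff for app_id, score in overall_scores.items()}

# Hypothetical proposal scores from the initial merit review
scores = {"2012-A": 1.4, "2012-B": 2.7, "2012-C": 3.8, "2012-D": 2.1}
print(flag_for_secondary_review(scores))
# {'2012-A': True, '2012-B': False, '2012-C': False, '2012-D': True}
```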
- In-person meeting for all members
- Entire proposals provided upon request
- Readers from the merit review present an overview
- Discussion
- Members provide a written overall impact score
- Outlier scores discussed
- Funding recommendations determined
- Reviewer A provides comments and scores
- Reviewer B provides comments and scores
- Reviewer A provides a summary report
- Additions/subtractions from the lead review
- Minority report
- Scores and reports submitted to the NIJ program manager
SCORE RANGE    ADJECTIVAL EQUIVALENT
1.0 to 1.5     Outstanding
1.6 to 2.0     Excellent
2.1 to 2.5     Very good
2.6 to 3.0     Good
3.1 to 4.0     Fair
4.1 to 5.0     Poor
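As an illustration of the score bands above, the following Python sketch maps an overall score to its adjectival equivalent. The function name adjectival_equivalent is hypothetical and the band boundaries simply restate the table; it is not part of the NIJ process documentation.

```python
def adjectival_equivalent(score):
    """Illustrative mapping from an overall score to the adjectival scale above."""
    bands = [
        (1.5, "Outstanding"),
        (2.0, "Excellent"),
        (2.5, "Very good"),
        (3.0, "Good"),
        (4.0, "Fair"),
        (5.0, "Poor"),
    ]
    if score < 1.0:
        raise ValueError("scores start at 1.0")
    for upper, label in bands:
        if score <= upper:
            return label
    raise ValueError("scores do not exceed 5.0")

print(adjectival_equivalent(1.3))  # Outstanding
print(adjectival_equivalent(2.8))  # Good
```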
- NIJ substantive experts review scientific review panel recommendations
- Provide a written determination if substantive experts disagree with the panel recommendation
- Provide panel and substantive expert recommendations to the NIJ office director
- Provide office-level recommendations to the NIJ director
- Funding decision
Timeline, March through September: proposals submitted; review and comment period; changes incorporated; stand-up of 5 review panels; NIJ Conference announcement; proposals to the scientific panel; scientific review panel meets; NIJ review; NIJ Director review/funding decision; awards processed; awards presented.
Qualitative process review for the pilot:
- Review at NIJ receipt
- Focus group for the scientific merit review
- Review of the scientific review panel
- Review of the panel recommendation
- Focus group for the scientific review panel
- Review of the NIJ expert review
- Review of the office director briefing
- Review of the NIJ director briefing