Logic Models and Evaluation Metrics for Successful Grant Proposals

 
OFFICE OF
FOUNDATION
RELATIONS
 
FOUNDATION RELATIONS
TEAM
 
Aaron Shonk, Senior Director
Aaron.Shonk@osufoundation.org
Ph: 7-6961

Elizabeth Ocampo, Coordinator
Elizabeth.Ocampo@osufoundation.org
Ph: 7-7362

Paul DuBois, Associate Director
Paul.DuBois@osufoundation.org
Ph: 7-3755
 
CO-PRESENTER
 
Mary Ellen Dello Stritto, Ph.D.
Assistant Director, Ecampus Research Unit
 
The Ecampus Research Unit (ECRU) conducts original research, creates and
validates instruments, supports full-cycle assessment loops for internal
programs, and provides resources to encourage faculty research and external
grant applications related to online teaching and learning.
 
Logic Models, Metrics, and Evaluations
 
Faculty Grant Training
February 22, 2019
 
 
Agenda

Why Evaluations?
Measurable Results
Evaluation Plan Checklists
Logic Models
Data Collection and Sampling
Conclusion

Why Evaluations?

Sponsors want to know if the project they funded worked or not.
 
 
Measurable Results
 
EVALUATION METRICS
Your grant will be judged by the measurable outcomes you propose. Proposals without clear evaluation metrics may be seen as ill-conceived or poorly designed. Depending on the sponsor, your proposal may be disqualified.
 
 
NOT MEASURABLE
“Expand student horizons.”
 
MEASURABLE
“Provide two experiential learning opportunities (e.g., study abroad, laboratory research with a faculty mentor, work study, or internships) to every undergraduate zoology student who graduates in 2020.”
 
 
Evaluation Plan Checklists
 
EVALUATING THE EVALUATION
Each funder, and even each grant program, will have different metrics to determine whether you’ve met the requirements for evaluating your program. Read the grant guidelines for indicators.
 
 
Examples of how evaluation plans can be reviewed:

https://www.escardio.org/static_file/Escardio/Subspecialty/Councils/CCNAP/Documents/grant_proposal_checklist.pdf

https://sponsoredprograms.eku.edu/sites/sponsoredprograms.eku.edu/files/files/forms/Proposal%20Review%20Checklist.docx
 
ONLINE EVALUATION RESOURCE LIBRARY
https://oerl.sri.com/home.html
 
Online resource with evaluation examples and
instruments for several content areas.
 
OERL is designed to help current and aspiring principal investigators and
evaluators learn more about how project evaluations have been planned,
implemented, and reported within the context of EHR programs. Additional
audiences include NSF program officers, evaluation professionals, and
evaluation training programs.
 
 
Logic Models
 
W.K. KELLOGG FOUNDATION
“…a logic model is a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program [or research], the activities you plan, and the changes or results you hope to achieve.”
 
 
Logic Model Examples

https://www.bttop.org/sites/default/files/public/W.K.%20Kellogg%20LogicModel.pdf
 
 
University of Wisconsin-Extension: templates and examples
 
Building a Logic Model

Logic models are read from left to right, but are often created in the reverse order.

CONSIDER:
1. What will be evaluated/measured?
   a. What level of data is needed?
   b. How will data be collected?
2. Select indicators of progress.
   a. Objectives met?
   b. Data collected?
 
 
Building a Logic Model

1. Lay out a simple logic model.
2. Fill in activities, outputs, and outcomes.
3. Sequence the model by connecting program components to outputs and outcomes.
4. Use arrows or other visuals to indicate relationships.
5. Revise!

Other resources: American Evaluation Association (www.eval.org)
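
For illustration only, here is a minimal logic model sketch built around the zoology objective from the Measurable Results example; the partners, instruments, and indicators are hypothetical placeholders, not prescribed by any sponsor:

Inputs: grant funds, faculty mentors, study abroad and internship partners
Activities: recruit placement partners; match each 2020 zoology undergraduate to two experiential learning opportunities; administer pre/post surveys
Outputs: number of placements offered; number and percentage of 2020 zoology graduates completing two opportunities
Outcomes: every undergraduate zoology student who graduates in 2020 has completed two experiential learning opportunities
Impact: stronger career readiness among zoology graduates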
 
 
 
EXERCISE
 
Search for logic model image examples
in your discipline
Ex. “logic models chemistry”
 
 
Data Collection and Sampling
 
KEY CONSIDERATIONS
1. Lay out why your sample population is the right one to answer your research questions.
2. Explain why, when, and how the data will be obtained.
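
For illustration only, a data collection and sampling statement for the zoology example might read (the instruments and timing here are hypothetical): “We will include the entire 2020 zoology graduating cohort rather than a sample, because the objective applies to every student in that cohort; pre/post surveys will be administered at the start and end of each experiential learning placement, and placement completion will be verified against departmental records each term.”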
 
 
QUESTIONS?