Program Evaluation Essentials at the Department of State

Collecting Data for Program Evaluation
at the Department of State
By Dean Olsen
Social Science Analyst
Evaluation and Measurement Unit
US Department of State
A data collection overview
Evaluation Design, Question(s), and Logic Models
Data Integrity
Differences in Data
Data Collection Methods and Analysis
Considerations for selecting data collection methods
Data Collection for Public Diplomacy Evaluations at the Department of State
Evaluation designs, questions, and logic models
The purpose for the evaluation
Who is the audience?
How will the evaluation be used?
Evaluation question(s)
Types of evaluation designs
Randomized experiments
Quasi-experimental 
Non-experimental Pre/Post-Test
Non-experimental Time Series
Analytical framework – Logic models
Program Logic Model
Your Planned Work: Resources/Inputs → Activities
Your Intended Results: Outputs → Outcomes → Impact
What does success look like?
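To make the framework concrete, here is a hedged worked example for a hypothetical teacher-exchange program; the program and every entry below are assumptions for illustration, not content from the slides, sketched as a small Python structure so the planned-work and intended-results stages stay distinct.

```python
# Hypothetical program logic model for an illustrative teacher-exchange program.
# "Planned work" = resources/inputs and activities; "intended results" = outputs,
# outcomes, and impact -- the answer to "what does success look like?"
program_logic_model = {
    "planned_work": {
        "resources_inputs": ["program staff", "grant funding", "partner schools"],
        "activities": ["recruit teachers", "run training workshops"],
    },
    "intended_results": {
        "outputs": ["120 teachers trained", "40 workshops delivered"],
        "outcomes": ["teachers apply new classroom methods"],
        "impact": ["improved student learning in partner schools"],
    },
}

for stage, components in program_logic_model.items():
    print(stage)
    for component, items in components.items():
        print(f"  {component}: {', '.join(items)}")
```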
Evaluation Logic Model
Columns: Evaluation Question | Type of Answer/Evidence Needed | Method for Data Collection (Method and Source) | Sampling/Selection Approach | Data Analysis Method
Example methods: Survey, Interview, Focus Group, Observation, Test/Exam, Document Review
Example sources: Participants, Parents, Teachers, Trainers, Stakeholders, Program documents
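To show how this matrix might be used, here is one hypothetical filled-in row; the evaluation question and every entry are assumptions chosen purely to illustrate how the columns relate, not an example taken from the slides.

```python
# One hypothetical row of an evaluation logic model (illustrative only).
evaluation_logic_row = {
    "evaluation_question": "Did participants change their teaching practice after training?",
    "type_of_answer_evidence": "self-reported and observed changes in classroom practice",
    "data_collection_methods": ["survey", "observation", "document review"],
    "data_sources": ["participants", "trainers", "program documents"],
    "sampling_selection_approach": "random sample of participants from each cohort",
    "data_analysis_method": "descriptive statistics plus content analysis of observer notes",
}

for column, value in evaluation_logic_row.items():
    print(f"{column}: {value}")
```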
Data Integrity
Reliability
The ability to produce consistent results; it indicates whether an evaluation would be likely to produce the same results if conducted again.
Validity
Coming as close as possible to accurately measuring what you intend to measure. For example, if you want to assess a child's math skills, the test you administer must be validated so that it measures the child's actual math ability and not something else, such as test-taking skill.
Source: Corporation for National and Community Service, http://www.nationalservice.gov/sites/default/files/resource/Description_of_Audio_Data_Collection_for_Program_Evaluation.pdf
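The slides define reliability conceptually; one common way to check it in practice is a test-retest correlation. The sketch below uses made-up scores and plain Python, and is only a hedged illustration of the idea, not a method prescribed by the source.

```python
import statistics

# Made-up scores from the same ten respondents answering the same survey item
# on two occasions (illustrative data only).
first_round  = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]
second_round = [4, 4, 3, 5, 2, 5, 4, 3, 4, 4]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# A test-retest correlation near 1.0 suggests the item produces consistent
# results; a low value flags a reliability problem worth investigating.
print(f"Test-retest reliability estimate: {pearson(first_round, second_round):.2f}")
```

Validity, by contrast, cannot be checked with a single statistic; it generally requires comparing the measure against independent evidence of what it is supposed to capture.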
Differences in Data
Secondary and Primary Data
Secondary data is data someone else collected; primary data is data you collect yourself. Each comes with its own advantages and disadvantages.
Quantitative Data
Information about quantities; that is, information that can be measured and expressed in numbers.
Qualitative Data
Information that cannot be expressed in numbers and is descriptive in nature.
Data Collection Methods and Analysis
Quantitative
Methods – Surveys, Polls, Questionnaires
Data someone else has already collected – Agency databases, public opinion polls, program performance statistics/metrics, publications, journals, books, magazines, etc.
Analysis – Statistics, graphs, charts
Qualitative
Methods – Structured and unstructured interviews, focus groups, open-ended questionnaires, observation, document reviews
Analysis – Content analysis, word clouds
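As a hedged sketch of the two analysis styles listed above, the snippet below computes simple descriptive statistics on hypothetical survey scores (quantitative) and word frequencies on hypothetical interview excerpts, a basic form of content analysis and the usual input to a word cloud. All data and the stop-word list are assumptions for illustration.

```python
import statistics
from collections import Counter

# Quantitative analysis: descriptive statistics on hypothetical survey scores.
scores = [3, 4, 4, 5, 2, 4, 5, 3, 4, 4]
print("mean:", round(statistics.fmean(scores), 2))
print("median:", statistics.median(scores))
print("std dev:", round(statistics.stdev(scores), 2))

# Qualitative analysis: word frequencies from hypothetical interview excerpts
# (a simple content analysis; the counts could also feed a word cloud).
excerpts = [
    "The exchange program helped me build lasting professional networks.",
    "Networks and new teaching ideas were the most valuable part of the program.",
]
stop_words = {"the", "and", "of", "a", "me", "my", "were", "was", "to", "most"}
words = [word.strip(".,").lower() for text in excerpts for word in text.split()]
counts = Counter(word for word in words if word not in stop_words)
print("top terms:", counts.most_common(5))
```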
Considerations for selecting data collection methods
Evaluation Question(s)
Nature, scope, and purpose of inquiry
Type of evaluation design
Availability of funds
Time factor
Precision required
Data Collection for Public Diplomacy Evaluations at the State Department
Stakeholders and Contacts
Understanding the culture, audience, and environment
Scheduling, Relationship Building, and Snowballing
Third Party Vendors
Who else do you think I should be talking to?
Questions?
 
This presentation covers the essentials of program evaluation at the Department of State: data collection methods, evaluation designs, logic models, data integrity, differences in data types, and the importance of reliability and validity. It also addresses data collection for public diplomacy evaluations and the distinction between quantitative and qualitative data.
