Evaluation Studio
Evaluation is a powerful tool in planning and implementing programs. With an emphasis on improving student achievement, evaluation needs to be an ongoing, integrated part of instructional programs. The Evaluation Studio provides three different avenues for assistance, depending on the individual’s and the organization’s needs:
- a beginning tutorial on evaluation terms and concepts
- tip sheets on how to design an evaluation tool based on needs
- examples of effective evaluation tools, plans and reports.
Learner Assessment Fact Sheet
Learner assessment is the process of collecting and recording information about student knowledge and skills and then interpreting that information against instructional objectives and standards of quality. This Fact Sheet further delineates essential components and different types of learner assessment, how those assessments are used and ways to improve assessment.
Research Fact Sheet
Research is a systematic process of collecting and analyzing objective evidence in order to establish facts and reach new conclusions, thus increasing common knowledge and understanding in the area of interest or concern. This Fact Sheet provides detail on the design of research methods and the research process itself.
Program Evaluation Fact Sheet
Program evaluation is a systematic process of gathering objective evidence about a program and using that evidence to make judgments about the merit or worth of the program. This Fact Sheet expands on this definition, outlining the steps of the evaluation process and conveying how evaluation information is used.
Evaluation Fact Sheet
Adding to the definition of evaluation set forth in the Program Evaluation Fact Sheet, this Fact Sheet clarifies how program evaluation differs from research and assessment. It also answers questions of when to conduct an evaluation, who should conduct it, how much should be budgeted for evaluation, and what should be done with evaluation findings.
Logic Models Fact Sheet
A logic model is a picture of how a program works. Logic models can be used throughout the life of a program. This Fact Sheet shows what a logic model looks like, what its basic components are and what possible expanded configurations can be included.
Evaluation Tools Decision Tree and Tip Sheets
Four basic evaluation tools are highlighted: interviews, focus groups, surveys and observations. Each of these tools follows a certain format and offers specific benefits. Through a series of questions, the Decision Tree helps determine which evaluation tool should be used in a given situation. The Tip Sheets present clear steps for using each of the different evaluation tools.
Developing Evaluation Questions Tip Sheet and Essential Steps
Good evaluation planning begins with good evaluation questions. The Tip Sheet for developing evaluation questions provides guidance in forming significant, focused, assessable questions for evaluation. Essential to the process of developing those evaluation questions are reviewing program objectives, assessing relevancy, obtaining input, using the evaluation matrix and reexamining the questions.
Embedding Evaluation Guidelines
Embedding evaluation in the processes of planning, implementing and reporting on a program saves time, conserves funds and improves the quality of evaluation findings. This document provides guidelines for embedding evaluation throughout the program development cycle.
Evaluation Matrix Tip Sheet and Template
An evaluation matrix helps plan and focus the evaluation process, and this Tip Sheet and Template are valuable tools for getting a matrix started. After following the steps on the Tip Sheet to identify key questions, objectives, indicators, data sources, due dates and responsible personnel, the pertinent information can be entered into the Template. This process makes it easier to embed evaluation throughout the program.
Scaling Tip Sheet
Scaling is the process of designing the response options to a survey question and assigning numerical values to those responses. This Tip Sheet explains the reasons for using different point scales, from a four-point to an eight-point scale.