
You will design a Public Health Program Evaluation Document. The
document is a continuation of the grant research you developed earlier.
Develop an evaluation plan that allows program evaluations to be
carried out efficiently in the future, and document the plan so that
evaluation activities can be repeated regularly and consistently.

Plans must include the following sections:

Title Page (name of the organization that is being evaluated, or whose
product, service, or program is being evaluated; date)
Table of Contents
Executive Summary (one page: concise program background, type of
evaluation conducted, what decisions are being aided by the findings of
the evaluation, and who is making the decisions)
Engagement of Stakeholders
All groups identified – those involved in program operations, those
served or affected by the program, and the primary intended users of
the evaluation.
Address the rights of human subjects, human interactions, and conflicts
of interest.
Address cultural competency.
Description of the Program
Statement of need – describes the problem, goal, or opportunity that
the program addresses: the nature of the problem or goal, who is
affected, how big it is, and whether (and how) it is changing; the
problem or opportunity to which the program is responding; the
program’s specific objectives.
Expectations – the program’s intended results: what the program must
accomplish to be considered successful; background on the organization
and program being evaluated; organization description and history.
Activities – everything the program does to bring about change:
describe program components, elements, strategies, and actions; the
principal content of the program; the delivery model.
Resources – the time, talent, equipment, information, money, and other
assets available to conduct program activities; include program costs
and the cost-benefit ratio as part of the evaluation; staffing (a
description of the number of personnel and the roles in the
organization that are relevant to developing and delivering the
program).
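As a hypothetical illustration of a cost-benefit ratio (the figures are
invented, not drawn from any actual program): if a program’s total
annual cost is 40,000 and it yields an estimated 100,000 in averted
treatment costs, the benefit-cost ratio is 100,000 / 40,000 = 2.5, that
is, 2.50 of benefit for every 1.00 spent.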
Program’s stage of development – reflects program maturity; address the
three phases of development (planning, implementation, and effects or
outcomes); program documentation.
Program’s context – the environment in which the program operates; the
area’s history, geography, politics, and social and economic conditions,
and what other organizations have done.
Logic model – the sequence of events from inputs and activities through
outputs to short-term and long-term outcomes; a flow chart, map, or
table portraying the sequence of steps leading to program results; a
clear description of program inputs and activities/processes; a clear
description of outcomes and impact.
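One possible way to lay out a single logic model row as a table, using
an invented example program (every entry is hypothetical):

    Inputs: 2 nurses, grant funding, clinic space
    Activities: weekly blood-pressure screening clinics
    Outputs: 500 residents screened per year
    Short-term outcomes: earlier detection and referral of hypertension
    Long-term outcomes: reduced cardiovascular morbidity and mortality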
Evaluation Design
Purpose – the general intent of the evaluation (to gain insight,
improve how things get done, determine the program’s effects, or affect
those who participate).
Users – specific individuals who will receive evaluation findings.
Uses – what will be done with what is learned from the evaluation.
Answer specific questions – the clarity and appropriateness of the
research questions, including which stakeholders will use the answers.
Methods – experimental, quasi-experimental, and observational or case
study designs.
Agreements – summarize the evaluation procedures and clarify everyone’s
roles and responsibilities; describe how the evaluation activities will
be implemented.
Address evaluation impact, practical procedures, political viability,
cost-effectiveness, service orientation, complete and fair assessment,
and fiscal responsibility.
Gathering Evidence
Indicators – translate general concepts about the program and its
expected effects into specific, measurable parts; describe the
independent and dependent variables and how they will be measured and
analyzed (a brief measurement sketch follows this section).
Sources of evidence – people, documents, or observations; clearly state
the criteria used to select sources and who participated.
Quality – the appropriateness and integrity of the information gathered.
Quantity – the amount of evidence gathered.
Logistics – the methods, timing, and physical infrastructure for
gathering and handling evidence: the time periods sampled, the data
collection methods and tools, and any limitations those methods and
tools impose.
Address information scope and selection; information sources; and
valid, reliable information.
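To make the indicator idea concrete, here is a minimal sketch (in
Python) of how one hypothetical indicator – screening coverage – could
be computed with a 95% confidence interval. The program, counts, and
variable names are all invented for illustration:

    import math

    # Hypothetical indicator: proportion of the target population screened.
    screened = 500            # residents screened (numerator)
    target_population = 2000  # eligible residents (denominator)

    coverage = screened / target_population                        # point estimate
    se = math.sqrt(coverage * (1 - coverage) / target_population)  # standard error
    ci_low, ci_high = coverage - 1.96 * se, coverage + 1.96 * se   # 95% Wald interval

    print(f"Coverage: {coverage:.1%} (95% CI {ci_low:.1%} to {ci_high:.1%})")

A defensible plan would state, in the same spirit, exactly how each
indicator’s numerator and denominator are defined and where the counts
come from.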
Justification of Conclusions
Standards – the values held by stakeholders about the program.
Analysis and synthesis – the methods used to discover and summarize an
evaluation’s findings; detect patterns in evidence through analysis,
synthesis, or mixed methods.
Interpretation – determining what the findings mean; interpretations
and conclusions.
Judgments – statements about the merit, worth, or significance of the
program, compared against one or more selected standards.
Recommendations – actions to consider as a result of the evaluation;
recommendations regarding the decisions that must be made about the
service/program.
Address the identification of values and the analysis of quantitative
and qualitative information.
Use and Dissemination of Lessons Learned
Design – how the evaluation’s questions, methods, and overall processes
are constructed.
Preparation – steps taken to get ready for the future uses of the
evaluation findings.
Feedback – communication that occurs among everyone involved in the
evaluation.
Follow-up – the support that users need during the evaluation and after
they receive evaluation findings.
Dissemination – the process of communicating the procedures or the
lessons learned from an evaluation to relevant audiences in a timely,
unbiased, and consistent fashion.
Reflection on Standards for “good” evaluations
Utility Standards

Stakeholder Identification: People who are involved in (or will be
affected by) the evaluation should be identified, so that their needs
can be addressed.
Evaluator Credibility: The people conducting the evaluation should be
both trustworthy and competent, so that the evaluation will be generally
accepted as credible or believable.
Information Scope and Selection: Information collected should address
pertinent questions about the program, and it should be responsive to
the needs and interests of clients and other specified stakeholders.
Values Identification: The perspectives, procedures, and rationale used
to interpret the findings should be carefully described, so that the
bases for judgments about merit and value are clear.
Report Clarity: Evaluation reports should clearly describe the program
being evaluated, including its context, and the purposes, procedures,
and findings of the evaluation. This will help ensure that essential
information is provided and easily understood.
Report Timeliness and Dissemination: Significant midcourse findings and
evaluation reports should be shared with intended users so that they can
be used in a timely fashion.
Evaluation Impact: Evaluations should be planned, conducted, and
reported in ways that encourage follow-through by stakeholders, so that
the evaluation will be used.
Feasibility Standards

Practical Procedures: The evaluation procedures should be practical,
keeping disruption of everyday activities to a minimum while the needed
information is obtained.
Political Viability: The evaluation should be planned and conducted with
anticipation of the different positions or interests of various groups.
This should help in obtaining their cooperation so that possible
attempts by these groups to curtail evaluation operations or to misuse
the results can be avoided or counteracted.
Cost Effectiveness: The evaluation should be efficient and produce
enough valuable information that the resources used can be justified.
Propriety Standards

Service Orientation: Evaluations should be designed to help
organizations effectively serve the needs of all of the targeted
participants.
Formal Agreements: The responsibilities in an evaluation (what is to be
done, how, by whom, when) should be agreed to in writing, so that those
involved are obligated to follow all conditions of the agreement, or to
formally renegotiate it.
Rights of Human Subjects: Evaluation should be designed and conducted to
respect and protect the rights and welfare of human subjects, that is,
all participants in the study.
Human Interactions: Evaluators should respect basic human dignity and
worth when working with other people in an evaluation, so that
participants do not feel threatened or harmed.
Complete and Fair Assessment: The evaluation should be complete and fair
in its examination, recording both strengths and weaknesses of the
program being evaluated. This allows strengths to be built upon and
problem areas addressed.
Disclosure of Findings: The people working on the evaluation should
ensure that all of the evaluation findings, along with the limitations
of the evaluation, are accessible to everyone affected by the
evaluation, and any others with expressed legal rights to receive the
results.
Conflict of Interest: Conflict of interest should be dealt with openly
and honestly, so that it does not compromise the evaluation processes
and results.
Fiscal Responsibility: The evaluator’s use of resources should reflect
sound accountability procedures and otherwise be prudent and ethically
responsible, so that expenditures are accounted for and appropriate.
Accuracy Standards

Program Documentation: The program should be described and documented
clearly and accurately, so that what is being evaluated is clearly
identified.
Context Analysis: The context in which the program exists should be
thoroughly examined so that likely influences on the program can be
identified.
Described Purposes and Procedures: The purposes and procedures of the
evaluation should be monitored and described in enough detail that they
can be identified and assessed.
Defensible Information Sources: The sources of information used in a
program evaluation should be described in enough detail that the
adequacy of the information can be assessed.
Valid Information: The information gathering procedures should be chosen
or developed and then implemented in such a way that they will assure
that the interpretation arrived at is valid.
Reliable Information: The information gathering procedures should be
chosen or developed and then implemented so that they will assure that
the information obtained is sufficiently reliable.
Systematic Information: The information from an evaluation should be
systematically reviewed and any errors found should be corrected.
Analysis of Quantitative Information: Quantitative information – data
from observations or surveys – in an evaluation should be appropriately
and systematically analyzed so that evaluation questions are effectively
answered.
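As one illustration of systematic quantitative analysis, the sketch
below (Python, using NumPy and SciPy) applies a paired t-test to
hypothetical pre- and post-program scores; the data and effect are
invented, and the appropriate test always depends on the evaluation
design:

    import numpy as np
    from scipy import stats

    # Hypothetical pre/post scores for the same ten participants.
    pre = np.array([12, 15, 11, 14, 13, 16, 12, 15, 14, 13])
    post = np.array([15, 17, 14, 16, 15, 18, 13, 17, 16, 15])

    # Paired t-test: did scores change systematically after the program?
    result = stats.ttest_rel(post, pre)
    print(f"mean change = {np.mean(post - pre):.2f}")
    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")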
Analysis of Qualitative Information: Qualitative information –
descriptive information from interviews and other sources – in an
evaluation should be appropriately and systematically analyzed so that
evaluation questions are effectively answered.
Justified Conclusions: The conclusions reached in an evaluation should
be explicitly justified, so that stakeholders can understand their worth.
Impartial Reporting: Reporting procedures should guard against the
distortion caused by personal feelings and biases of people involved in
the evaluation, so that evaluation reports fairly reflect the evaluation
findings.
Metaevaluation: The evaluation itself should be evaluated against these
and other pertinent standards, so that it is appropriately guided and,
on completion, stakeholders can closely examine its strengths and
weaknesses.

Grading Rubric (weights shown after each criterion; performance levels:
Marginal = 1, Acceptable = 2, Commendable = 3, Outstanding = 4)

Title Page, Contents & Summary (x1.5)
Marginal (1): Title page, table of contents, and executive summary not
complete.
Acceptable (2): Title page, table of contents, and executive summary
minimally addressed; generic.
Commendable (3): Title page, table of contents, and executive summary
completely addressed; all items covered in detail.
Outstanding (4): Title page, table of contents, and executive summary
complete and detailed; summary demonstrates thought.

Stakeholders (x2)
Marginal (1): Limited; stakeholders appear to be understood, but some
links are often not evident. Stated yet imprecise.
Acceptable (2): Imprecise; some information expressed clearly and
critiqued. Some supporting evidence is used.
Commendable (3): Clearly stated. Analysis is significant. Information
expressed clearly and analyzed; most points have supporting evidence.
Outstanding (4): Specific and connected, clear and logical. Information
is clearly and precisely stated. Analysis is thorough, logical, and
significant. Information on stakeholders is expressed clearly and
critiqued with supporting evidence.

Description (x2)
Marginal (1): Generic description; not all areas addressed.
Acceptable (2): Statement of need, expectations, activities, resources,
stage of development, and program context addressed. Logic model
explained and represented in a flow chart, map, or table. Some
understanding is evident; needs some clarification.
Commendable (3): Statement of need, expectations, activities,
resources, stage of development, and program context addressed. Logic
model explained and fully represented in a flow chart, map, or table.
Demonstrates understanding. Purpose and description are evident and
articulated.
Outstanding (4): Statement of need, expectations, activities,
resources, stage of development, and program context addressed. Logic
model explained in detail and fully represented in a flow chart, map,
or table. Demonstrates clear understanding. Purpose and description are
evident and well articulated.

Evaluation Design (x2)
Marginal (1): Unclear and/or ambiguous. Accuracy of statements is not
clear; detail is needed to provide exactness in meaning.
Acceptable (2): Purpose, users, and uses described. Specific questions
addressed. Explains some methods and agreements; mentions evaluation
activities. Does not address all areas (evaluation impact, procedures,
effectiveness, and viability). Not all information is directly related
to the topic.
Commendable (3): Purpose, users, and uses described. Specific questions
addressed with clarity and appropriateness. Mostly explains methods and
agreements; describes how the evaluation activities will be
implemented. Addresses evaluation impact, procedures, effectiveness,
and viability.
Outstanding (4): Purpose, users, and uses described. Specific questions
addressed with clarity and appropriateness. Completely explains methods
and agreements; describes how the evaluation activities will be
implemented. Addresses evaluation impact, procedures, effectiveness,
and viability in detail.

Gathering Evidence (x2)
Marginal (1): Unclear or not stated. Supporting evidence is often
missing; detail is needed to provide exactness in meaning.
Acceptable (2): Indicators and sources of evidence mostly addressed.
Quality, quantity, and logistics vaguely discussed. Limitations, scope,
and selection mentioned. Supporting evidence is usually present.
Commendable (3): Indicators and sources of evidence addressed. Quality,
quantity, and logistics discussed. Addresses limitations, scope, and
selection. Relevant examples explain most ideas. Supporting evidence is
present.
Outstanding (4): Indicators and sources of evidence clearly addressed.
Quality, quantity, and logistics discussed in detail. Addresses
limitations, scope, and selection in detail. Supporting evidence is
present; consistently relevant detail makes the meaning exact.

Justifications & Conclusions (x1)
Marginal (1): Unclear or not stated. Supporting evidence is often
missing; detail is needed to provide exactness in meaning. Not all
areas addressed; recommendations weak.
Acceptable (2): Addresses standards. Provides generic analysis,
synthesis, and interpretations. Does not clearly substantiate
judgments. Provides basic recommendations not fully linked to the
evaluation.
Commendable (3): Addresses standards. Provides analysis and synthesis,
some interpretations, and substantiates judgments. Provides
recommendations based on the evaluation.
Outstanding (4): Addresses standards. Provides clear analysis and
synthesis, solid interpretations, and substantiated judgments. Provides
detailed recommendations based on the evaluation.

Use & Dissemination (x1)
Marginal (1): Unclear or not stated. Supporting evidence is missing;
detail is needed to provide exactness in meaning. Not all areas
addressed.
Acceptable (2): Minimally identifies uses in design and preparation.
Provides some feedback. Lists basic follow-up ideas. Generic ideas for
dissemination.
Commendable (3): Completely identifies uses in design and preparation.
Delineates feedback among everyone involved in the evaluation. Lists
substantiated follow-up ideas. Relevant ideas for dissemination.
Outstanding (4): Clearly identifies uses in design and preparation.
Delineates detailed feedback among everyone involved in the evaluation.
Lists strong follow-up ideas. Creative and relevant ideas for
dissemination.

Reflection on Standards (x2)
Marginal (1): Mentions standards without reflection. No demonstration
of understanding and application.
Acceptable (2): Addresses some standards (Utility, Feasibility,
Propriety, and Accuracy). Reflects generically, with little
understanding and application.
Commendable (3): Addresses all standards (Utility, Feasibility,
Propriety, and Accuracy). Reflects on all components, addressing
strengths and weaknesses. Demonstrates understanding and application.
Outstanding (4): Fully addresses all standards (Utility, Feasibility,
Propriety, and Accuracy). Reflects on each component, addressing
strengths and weaknesses. Demonstrates clear understanding and
application.

Organization, Mechanics, and Grammar


