NAACLS News


Volume 81 - Spring / Summer 2002



Evaluating Programs Through a Formal Process of Data Collection, Analysis and Modification
by Karen M. Myers MA, MT(ASCP)SC, CLS(NCA)
Programs Approval Review Committee Member

"Begin at the beginning, " the King said gravely, "and go on till you come to the end: then stop."

- Lewis Carroll (1865)

All programs accredited or approved by NAACLS are involved in program evaluation, whether informally or formally. Every time we change a course component, respond to the needs of a clinical site, or adjust a curricular schedule, we engage in the process of evaluation. We collect data. We modify our courses, the curriculum, or even the entire program. We review the results of such changes and begin the process of evaluation anew. In my work with CLS/MT and CLT/MLT accreditation self-studies and site visits, and with PARC program approvals, I ask programs to describe the significant changes they have implemented. Program directors can always describe these changes, and they know clearly why a change was made. I then ask to review the documentation that demonstrates the history of those changes. Effective program evaluation requires that the process be documented in some systematic way. Without systematic documentation of the data collected, analysis of those data, and a summary of the program's response to the findings, evaluation remains informal.

All programs are responsible for demonstrating that program evaluation is tied to ongoing data collection and analysis and is reflected in curriculum development and program modification. Program evaluation requires documentation that ties changes to analysis. Several Standards explicitly focus self-studies on explanations of the results of survey instruments, such as those the program collects from students, employers, faculty and graduates, and on analyses of outcome measures such as certification examinations or capstone projects. Implicit throughout the Standards, however, is the much larger issue of continuous program evaluation and improvement, which needs to be clearly addressed by both the writers and the reviewers of self-studies. Program evaluation is the umbrella under which outcome measures fall; it includes the processes by which a program decides to make modifications in policy or curriculum.

"Evaluation is the systematic process of collecting and analyzing data in order to make.decision(s) [regarding program operations], .or to determine whether and to what degree objectives have been or are being achieved." (p.4)1 In evaluation we compare our findings "to a set of explicit or implicit standards, as a means of contributing to the improvement of the program." (p.4)2 Thus, a systematic method of evaluation necessitates a formalized, documented process. A good reminder regarding the preparation for accreditation and review remains the old adage already quite familiar to all health care practitioners, "If it isn't documented, it didn't occur."

While faculty and officials new to a program may focus on putting an effective evaluation process in place or continuing one, there are many purposes for evaluation beyond those expected by accrediting agencies. These include program justification, fulfilling funding requirements, collecting data for anticipated justification and funding needs, supporting public relations efforts, enhancing organizational learning, and the ongoing improvement or change of a program.1,2 An effective evaluation process provides a program with recorded data that can support the "value of ... [program] activities, the effectiveness of ... processes, and their impact on the people involved and the organization." (p.22)1

While Lewis Carroll admonishes us to begin at the beginning and go to the end and then stop, educators know there isn't an ending. A circle captures the nature of systematic, ongoing evaluation. (See Figure 1 on Page 7.)

In reality it is a cyclical process. We never return to the same place twice because the program changes and we learn from our analysis of that change. If each step in the evaluation cycle is documented, there will be evidence of systematic evaluation for whatever purposes the program may wish to use the process.

Program evaluation begins with a philosophy of evaluation and a documented process by which that evaluation is to be accomplished. This requires faculty to engage in conversations about evaluation. These discussions regarding the program and its processes can become a component of regularly scheduled faculty or department meetings. Because formal evaluation requires assessment against appropriate standards, faculty can begin to examine their program through the lens of their evaluation philosophy, the procedures they use for evaluation, the standards chosen for comparison, and the anticipated outcomes.

Evaluation philosophy is both program and institution specific. The process of developing a philosophy provides participants the opportunity to look at how the current program is being evaluated and what might be gained from changing that process. The newest NAACLS Standards for CLS and CLT require that programs establish an advisory committee. A program's evaluation philosophy may be reflected in the composition of its advisory committee, that is, by selecting individuals who represent those constituents from whom it is important to receive feedback. Such constituents include other educators, employers, community representatives, physicians, health care administrators, and students.

When educators involve an advisory committee in the evaluation process, they open the program to measures that may be critical to its ongoing success within its community, measures that may not always be apparent to program faculty. Determining what data to collect should be a locally directed effort, even while evaluators remain sensitive to national trends and the needs of the profession.

Data analysis involves the evaluation of both program processes and outcome measures. "Outcomes are what occur as a direct result of an action (that is, training, services and teaching), usually measured immediately after an activity has been performed." (p.25)1 Processes are the sources of action, the means by which we reach an outcome. Outcome measures such as the performance of graduates on external certifying examinations or capstone projects are directly addressed in one of the NAACLS Standards. Graduation and placement rates of students, addressed in another Standard, may be established as outcome measures for a program, or they may be reviewed as program processes. For example, a program that uses graduation rates as an outcome measure could also look at the attrition and success rates of graduates. If graduation rates are examined as a program process, a program might examine how the curriculum can accommodate students and ensure the highest graduation rates possible. Student and employer satisfaction surveys can be used as outcome measures when the program is attempting to achieve a certain level of satisfaction within each group. The same surveys can be used to evaluate process if questions are included regarding specific course content and the ways a course prepared, or failed to prepare, students to enter clinical rotations. Curricula can be modified as a result of data collected from employers when questions solicit information regarding the preparedness of students to enter a specific clinical field.

In summary, effective programs actively engage in evaluation, and they document the process of evaluation both for internal retrospective analysis and in order to share findings with outside constituents. By collecting data, analyzing and responding to the data collected, and evaluating the results of change, programs tie their decisions to the historical factors they have studied. Evaluation remains a potent tool for program improvement and program justification. The self-study process of accreditation and approval is an important component of a larger evaluation process that involves both formal and informal components. NAACLS program accreditation and approval require faculty to provide examples of a significant change resulting from program evaluation, along with an analysis of the effectiveness of that change. Programs that engage in active cycles of evaluation will have many such examples from which to choose.

References

  1. Boulmetis, J, Dutwin, P. The ABCs of Evaluation: Timeless Techniques for Program and Project Managers. San Francisco, CA: Jossey-Bass Publishers; 2000.
  2. Weiss, CH. Evaluation: Methods for Studying Programs and Policies. Upper Saddle River, NJ: Prentice Hall; 1998.









