The timing of data collection is critical. For programs targeting individual behaviour change, evaluating adherence over time is an important component of measuring impact. Most experience suggests that data should be collected twice after implementation, at six months and at 12 months. Organizational change lags behind individual change, and some effects may not be seen for several years.
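To illustrate what the two post-implementation measurement points can look like in practice, the sketch below computes an adherence rate at six and 12 months from hypothetical attendance records. The record layout, the number of sessions offered and the 75% adherence cut-off are assumptions for illustration only; substitute the program's own definition of adherence.

```python
"""Illustrative sketch: adherence at the two recommended post-implementation
data-collection points (6 and 12 months).

All names, figures and the 75% adherence cut-off below are hypothetical
assumptions, not taken from the source."""

from collections import defaultdict

# Hypothetical participation records: (employee_id, wave, sessions_attended),
# where 'wave' marks the data-collection point ("6mo" or "12mo").
records = [
    ("E01", "6mo", 10), ("E01", "12mo", 7),
    ("E02", "6mo", 12), ("E02", "12mo", 12),
    ("E03", "6mo", 4),  ("E03", "12mo", 2),
]

SESSIONS_OFFERED = 12        # assumed number of sessions offered per period
ADHERENCE_CUTOFF = 0.75      # assumed share of sessions that counts as adherent


def adherence_rate(wave: str) -> float:
    """Share of employees attending at least the cut-off proportion of
    offered sessions during the given data-collection wave."""
    attended = defaultdict(int)
    for employee, w, sessions in records:
        if w == wave:
            attended[employee] += sessions
    adherent = sum(
        1 for total in attended.values()
        if total / SESSIONS_OFFERED >= ADHERENCE_CUTOFF
    )
    return adherent / len(attended) if attended else 0.0


for wave in ("6mo", "12mo"):
    print(f"{wave}: {adherence_rate(wave):.0%} of participants adherent")
```

Comparing the two waves shows whether adherence is holding steady or dropping off, which is the kind of trend the six- and 12-month collection points are meant to capture.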
There are ethical issues associated with some evaluation designs and methods, particularly when using previously collected data, such as insurance claims or attendance records, that were not originally gathered for program purposes. Similarly, some employees may be hesitant to provide personal data if they think it might later be used against them, for example when promotions are being considered.
When determining the evaluation instruments for data collection, first look for those that already exist, have been validated and have been found to be reliable. Ensure the information being sought through the tool will be accessible to the workplace being studied.
It’s not easy to undertake workplace health evaluation effectively. Workplaces are diverse in workforce demographics, organizational structure, size and the mix of issues and programs underway, which makes conventional health promotion evaluation strategies challenging to apply. The challenge is compounded if the committee is trying to evaluate impact across settings: from the workplace into the community and the employee’s home. One way to alleviate some of these challenges is to use a Participatory Action Research approach, which engages participants from across the diverse workplace population in shaping the evaluation plan, from setting the research questions and identifying the most appropriate means of collecting data through to analyzing and communicating the results.
Communicating evaluation results
Once the data have been collected and analyzed appropriately, it’s time to interpret the results and communicate them to the various stakeholders.
Interpret and disseminate results
The format of the evaluation results will depend on the audience receiving them. The W.K. Kellogg Foundation offers the following suggestions in its Evaluation Toolkit[1]:
Tips on Writing Effective Reports
Know who your audience is and what information they need.
Relate evaluation information to decisions.
Start with the most important information.
Develop concise reports by writing a clear abstract and starting each paragraph with the most important point.
Write short focused paragraphs.
Highlight important points.
Do not use professional jargon or difficult vocabulary.
Use active verbs.
Have your report edited, looking for unnecessary words and phrases.
Be creative and innovative in reporting evaluation findings. Use a variety of techniques such as visual displays, oral presentations, summary statements, interim reports and informal conversations.
Write and disseminate a complete evaluation report, including an executive summary and appropriate technical appendices.
Write separate executive summaries and popular articles using evaluation findings, targeted at specific audiences or stakeholder groups.
Write a carefully worded press release and have a prestigious office or public figure deliver it to the media.
Hold a press conference in conjunction with the press release.
Make oral presentations to select groups. Include demonstration exercises that actively involve participants in analysis and interpretations.
Construct professionally designed graphics, charts, and displays for use in reporting sessions.
Make a short video or audiotape presenting the results, for use in analysis sessions and discussions.
Stage a debate or advocate-adversary analysis of the findings in which opposing points of view can be fully aired.
Below is a typical outline for the presentation of evaluation results:
Title
Executive summary
Introduction
Program rationale and logic
Description of the initiative/program
Evaluation methods
Research/findings
Discussion
Conclusion
Acknowledgements
References
Appendices
Best practices for workplace mental health promotion evaluation
The World Health Organization offers the following best practices (adapted for mental health promotion) for the monitoring and evaluation of workplace health promotion programs[2]:
Include outcome indicators at each level of evaluation: formative, process, intermediate and impact.
Outcome indicators should be directly related to, and dependent on, the intervention components and objectives; they should follow logically from decisions made at each level of evaluation.
An extensive process evaluation should always be included; qualitative information on program preparation and implementation provides useful insight into how to make programs more successful.
A measure to quantify the inside (worksite) or outside environment should be included. An increasing number of such instruments are currently being developed for research purposes.
Use validated, shorter Internet or intranet questionnaires to simplify data management and to increase the response rate of subjects.
Use innovative objective instruments to monitor and evaluate.
The inclusion of an extensive and expensive set of biological indicators may not be necessary or feasible. In line with program components and objectives, a relatively small set of feasible, less expensive biological indicators may be sufficient in practice.
In combination with the continuous monitoring of sick leave at most worksites, regular (yearly) health check-ups of employees should likewise be incorporated into company health policy. Including the recommended small set of biological indicators and/or a questionnaire in a yearly check-up could:
give insight into long-term health changes;
automatically measure health changes due to newly implemented policy or intervention elements;
provide insight into the efficacy of interventions so that health trends can be anticipated;
serve as a benchmark for the company’s health policy;
provide a continuous data flow, making cost-benefit analysis achievable (a simple sketch follows this list);
contribute to employees’ perception of the company’s commitment to occupational health management.
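To illustrate the cost-benefit point in the list above, the sketch below values avoided sick-leave days against annual program costs using continuously monitored absence data. All figures, and the simplifying assumption that an avoided day of absence can be valued at an average daily cost, are hypothetical; a simple before-and-after comparison like this does not control for other influences on sick leave and is only a starting point for a fuller analysis.

```python
"""Illustrative cost-benefit sketch built on continuously monitored
sick-leave data and a yearly check-up.

All figures below are hypothetical assumptions, not taken from the source."""

# Assumed annual figures for a hypothetical 250-person worksite.
employees = 250
program_cost_per_employee = 120.0   # assumed annual program cost per employee
sick_days_baseline = 7.4            # mean sick-leave days per employee, year before the program
sick_days_program = 6.6             # mean sick-leave days per employee, program year
daily_absence_cost = 310.0          # assumed average cost of one day of absence

total_cost = employees * program_cost_per_employee
avoided_days = (sick_days_baseline - sick_days_program) * employees
estimated_benefit = avoided_days * daily_absence_cost

print(f"Program cost:       {total_cost:,.0f}")
print(f"Avoided sick days:  {avoided_days:,.0f}")
print(f"Estimated benefit:  {estimated_benefit:,.0f}")
print(f"Benefit-cost ratio: {estimated_benefit / total_cost:.2f}")
```

Because the sick-leave data are monitored continuously, the same calculation can be re-run each year as new figures accumulate.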
Information:
Evaluating CWHP Info-Pack, by THCU, contains an overview of process and outcome methods appropriate for evaluating comprehensive workplace health promotion (CWHP); steps for developing and implementing CWHP evaluations; and a sample CWHP logic model. See http://www.thcu.ca/workplace/documents/EvaluationInfoPackFinalWeb.pdf.
Prevention and Promotion in Mental Health, by the World Health Organization, provides information on a variety of evaluation methods specific to mental health promotion. See http://www.who.int/mental_health/media/en/545.pdf.
Factors to Consider When Deciding on an Evaluation Type (see the tables below). This tool, created by THCU, will help determine which type of evaluation is best suited for a particular program.
Tools:
Factors to Consider When Deciding on an Evaluation Type
Table 1: Commonly Used Qualitative Methods
Table 2: Commonly Used Quantitative Methods
References
[1] W.K. Kellogg Foundation, “Evaluation Handbook,” http://www.wkkf.org/knowledge-center/resources/2010/W-K-Kellogg-Foundation-Evaluation-Handbook.aspx (accessed December 29, 2009).
[2] Luuk Engbers, “Monitoring and Evaluation of Worksite Health Promotion Programs – Current state of knowledge and implications for practice,” World Health Organization (2008), http://www.who.int/dietphysicalactivity/Engbers-monitoringevaluation.pdf (accessed December 29, 2009).