Development of an Evaluation Framework for Health Promotion and Improvement Practice

Dr Louise Tully

Health Promotion and Improvement (HP&I) was most easily described as going ‘upstream’, tackling the factors that affected health before they manifested as chronic or co-morbid conditions, Dr Louise Tully, Health Promotion and Improvement Officer, Dublin and South East Region, told an HMI East Regional Seminar.

She said it was rooted in addressing health inequalities, and was underpinned by the Social Determinants of Health (SDoH) – a model conveying how the environments in which we were born, grew, lived, worked and aged, and the systems surrounding these, played a major part in our risk of ill-health.

“In our region, as with all HSE regions, key components of health promotion practice are education, community action and the development of healthy public policy. Our evidence-based programmes support the aims of the Healthy Ireland (HI) Framework, our local Healthy Ireland Implementation Plan, and Sláintecare. We work across an array of policy priority programmes. Our work covers alcohol, healthy eating, physical activity, mental health, staff health and wellbeing and sexual health, in addition to cross-cutting work (communications and stakeholder engagement) and local standalone projects. Because of that, the required skill set for HP&I is diverse, as is our team. Our HP&IOs come from backgrounds in pharmacy, research, sports science, nutrition, nursing and education, to name a few.

“All of the work that is nationally funded and implemented is carried out on the basis of strong supporting international evidence, and usually with a national evaluation to support local roll-out. However, while data relating to key performance indicators (e.g. programmes delivered, tobacco clients seen) are reporting requirements, there is less guidance or structure in place for evaluating our own practice.

“All of this means that there is a spectrum of both confidence and capacity among our team to design and carry out evaluations of our practice and to interpret the resulting data. Thus, the project of developing an evaluation framework was conceived. This project aimed to develop a framework and toolkit for evaluating our health promotion and improvement practice in HSE Community Healthcare East. In seeking an external consultancy to design this, we set out the following criteria:

  • A framework against which we could plan the most appropriate method to answer the most appropriate questions.
  • Aligned to the values of the discipline of health promotion.
  • Co-designed/in consultation with HP&I staff (as outlined in the HSE Change Guide).
  • Translatable across existing programmes and to new programmes/projects as they arise.
  • Accessible for staff and partner stakeholders.

“We also specified that it would incorporate rigorous, best-practice guidance in relation to research and evaluation, considering aspects like patient and public involvement (PPI), or ‘service-user engagement’ as it is referred to within our organisation. This brief attracted a number of proposals and, after careful consideration, we set to work with S3 Solutions, whose approach entailed four stages.

(i) Desk review

“This provided a holistic overview of the range of activities and services, identifying common themes and areas of focus. It included an in-depth assessment of the current HP&I work programmes, including an analysis of programmatic data, monitoring reports, previous and recent evaluations, and funding proposals. Notably, we were already doing a lot of reporting; however, it was not consistent (various programmes required slightly different metrics) and not necessarily aligned to our own evaluation interests.

(ii) Evaluability in Principle – Logic model development

“This was a major project phase, and involved identifying (a) the problem each category of work aimed to address, the activities undertaken and the intended change, and (b) the indicators that would evidence the extent to which the change had occurred. In total, 12 facilitated workshop discussions and six individual meetings took place with HP&I staff to establish conceptual evaluation frameworks.
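
To make the logic-model step concrete, the sketch below shows how one such model might be captured as a simple data structure linking problem, activities, intended change and indicators. It is purely illustrative: the field names and the example programme are assumptions, not the team's actual models.

```python
from dataclasses import dataclass

@dataclass
class LogicModel:
    """One conceptual evaluation framework for a category of HP&I work.

    Mirrors the two workshop questions: (a) the problem, activities and
    intended change, and (b) the indicators evidencing that change.
    """
    problem: str            # the issue this category of work aims to address
    activities: list[str]   # what is actually delivered
    intended_change: str    # the change the activities are meant to produce
    indicators: list[str]   # evidence of the extent of that change

# Hypothetical example for a physical activity programme
walking_programme = LogicModel(
    problem="Low physical activity levels among older adults",
    activities=["Weekly led community walks", "Walk-leader training"],
    intended_change="Sustained increase in participants' weekly activity",
    indicators=[
        "Attendance over a 12-week block",
        "Self-reported activity at baseline and follow-up",
    ],
)
```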

(iii) Scoring and prioritising indicators

“‘Evaluability in Practice’ seeks to identify any potential barriers to data collection that would impinge on the successful and reliable roll-out of an evaluation framework. To keep the approach to evaluation realistic, both the short-term outputs and the longer-term indicators were tested. Scoring and prioritisation exercises were undertaken with staff to assess each identified indicator for accessibility and value, with a scoring matrix used to categorise indicators as ‘could’, ‘should’ or ‘must’ collect on this basis.
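
As a rough sketch of how a ‘could/should/must collect’ matrix might work in practice: the two-axis scoring and the thresholds below are illustrative assumptions, not the actual matrix used in the project.

```python
def categorise_indicator(accessibility: int, value: int) -> str:
    """Place an indicator in a 'must'/'should'/'could' collect category.

    Each indicator is scored 1-5 on two axes: accessibility (how feasible
    the data are to collect) and value (how informative they are for
    evaluation). The thresholds here are illustrative only.
    """
    if accessibility >= 4 and value >= 4:
        return "must collect"    # feasible and highly informative
    if accessibility >= 3 or value >= 4:
        return "should collect"  # worthwhile, weighed against burden
    return "could collect"       # optional; collect if resources allow

# e.g. routine attendance records: very accessible, moderately valuable
print(categorise_indicator(accessibility=5, value=3))  # -> should collect
```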

“These steps then formed the basis of an overarching framework, bespoke to our work, which used a combination of the PRISM/RE-AIM framework and the COM-B behaviour change model. The domains of our framework are reach, experience, impact, maintenance and value. This provides a scaffold for evaluating all existing and new programmes of work.
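
The five domains can be read as a checklist of evaluation questions for any programme. The guiding questions in the sketch below are an interpretation offered for illustration; only the domain names come from the framework itself.

```python
from enum import Enum

class Domain(Enum):
    """The framework's five domains, each paired with an illustrative
    guiding question (the questions are assumptions, not the framework's)."""
    REACH = "Who, and how many, did the programme reach?"
    EXPERIENCE = "How did participants and staff experience it?"
    IMPACT = "What changed as a result?"
    MAINTENANCE = "Were delivery and its effects sustained over time?"
    VALUE = "Was the change worth the resources invested?"

for domain in Domain:
    print(f"{domain.name.title()}: {domain.value}")
```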

“Finally, in addition to the above steps, and in conjunction with our local service-user engagement officer, we consulted clients engaged with various health promotion activities in our community. This was aimed at ensuring that our proposed evaluation approaches were appropriate and made sense to the end user. A number of key considerations emerged from these consultations, including:

  • Caution with sensitive demographic data – building rapport and explaining the purpose of collecting these.
  • Understanding the service-user’s own goals or measures of success.
  • The impact on the wider community – e.g. caregivers, families.
  • Sensitivity around literacy and preferences relating to the feedback/evaluation medium.
  • A sense of ownership.”

Dr Tully said that as a department, they were now in a strong position to systematically assess their work, look for areas for improvement and indeed demonstrate what they were achieving. “Within our role, we are often one step removed from the service-user or client, and understanding our impact can be difficult. However, service evaluation is a key component of any health care delivery, and understanding what we need to measure, while fundamental, is important to get right. This year, we are focusing on fully implementing this framework across our work, beginning with aligning our data collection across all areas. This will be followed by the challenge of balancing robust data collection, and importantly its interpretation and reporting, with keeping the administrative burden on staff manageable. Streamlining our processes and ensuring efficiency will be a major task in this implementation phase, but will ultimately serve the broader goal of the project as a whole.”