Evaluation


What is an evaluation?

An evaluation is the use of scientific methods and the rigorous, systematic collection of research data to assess the effectiveness of organisations, services and programmes in achieving predefined objectives. For health services, it is used to see whether they fulfil their stated goals, targets or objectives. It is usually based on the collection of data about the structure, inputs, process, outputs and outcomes of the service, as well as the appropriateness of the service (e.g. Vitamin D treatment has side effects for the severely deficient). Evaluation is more than audit because it seeks to record not only what changes occur but also what led to those changes.

Evaluation can be formative or summative. A formative (or process) evaluation involves the collection of data while the organisation or programme is active; the aim is to improve or develop the programme. A summative (or outcome) evaluation involves collecting data about an active or terminated programme or organisation; the aim here is to decide whether it should be continued or repeated, e.g. continuing a screening programme or re-running a health promotion campaign for smoking if rates have ceased to decline.

Difference between Monitoring Performance and Evaluation

Monitoring performance occurs during the delivery of a service or intervention. The indicators are usually set out in contracts as Key Performance Indicators (KPIs); if you are conducting a project, you would decide in advance which indicators you would use to measure the efficiency and effectiveness of the service or intervention in achieving its aims. Efficiency looks at whether the inputs are appropriate to the outputs, while effectiveness refers to whether or not the objectives are being achieved. Evaluation also looks at efficiency and effectiveness, but has the added value of assessing impact.
When you are monitoring performance you are looking at things in action, checking that you are getting value for money, with the ability to troubleshoot should things go astray. With an evaluation, things have already happened and you are looking to see whether the actions taken led to the desired outcome, for example evaluating whether or not GP practice visits led to an improvement in uptake of the childhood 'flu vaccine. In this instance, monitoring of performance would be where action plans were agreed with the GP practices and the public health commissioners then sought evidence that the actions had been carried out.
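The distinction can be illustrated with a toy sketch: monitoring compares live performance against a contractual KPI, while evaluation asks whether the intervention produced the desired change. All practice names, targets and uptake figures below are invented for illustration.

```python
# Hypothetical example: monitoring vs evaluating GP practice 'flu vaccine uptake.
# All names, targets and figures are invented; none come from real NHS data.
kpi_target = 60.0  # % uptake agreed in the contract (assumed figure)

uptake_before = {"Practice A": 48.2, "Practice B": 55.0, "Practice C": 61.3}
uptake_after  = {"Practice A": 58.9, "Practice B": 63.4, "Practice C": 62.0}

# Monitoring: while delivery is live, flag practices falling short of the KPI
# so commissioners can troubleshoot.
below_target = [p for p, u in uptake_after.items() if u < kpi_target]

# Evaluation: after the programme, did the practice visits lead to the
# desired outcome (an improvement in uptake)?
mean_change = sum(uptake_after[p] - uptake_before[p]
                  for p in uptake_before) / len(uptake_before)

print("Below KPI target:", below_target)
print(f"Mean change in uptake: {mean_change:.1f} percentage points")
```

The same data serve both purposes; what differs is the question asked of them.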

Note on 'outcome' and 'indicator'

A health outcome is defined as the change - or lack of change when change is expected - in health, health-related status or risk factors affecting health. Outcomes may be attributable to medical interventions or they may be the result of the natural history of the condition. Considerations of case mix, bias and confounding factors are central to the appropriate interpretation of indicator values. To assess an outcome, you will need an adequate description of the service, the underlying level of risk and the frequency of outcomes, and the data collection needs to be feasible. When measuring outcomes, make sure you consider what drives them (e.g. risk factors, genetics) and remember that case fatality is influenced by case mix and quality of care.

An indicator is an aggregated statistical measure describing a group or whole population, compiled from measures on individuals, which provides insights about the functioning of services. Indicators will not necessarily provide definite answers on whether services are good, adequate or inadequate, but when well chosen they should be capable of providing pointers to where further investigation may be worthwhile. An example of an indicator is in cancer screening, where screening activity can be a proxy measure for outcomes - what % are you covering? What % is being picked up?
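As a toy illustration of those two screening questions, coverage and detection rate can be computed as simple proxy indicators. All figures here are invented, not drawn from any real programme.

```python
# Toy screening indicators (all figures invented for illustration).
eligible_population = 50_000   # people invited for screening
screened = 38_500              # people who attended
cases_detected = 154           # screen-detected cases

# "What % are you covering?"
coverage = screened / eligible_population * 100

# "What % is being picked up?" - conventionally expressed per 1,000 screened
detection_rate = cases_detected / screened * 1000

print(f"Coverage: {coverage:.1f}%")
print(f"Detection rate: {detection_rate:.1f} per 1,000 screened")
```

Note that neither figure says whether the service is good or bad on its own; as the text above stresses, indicators are pointers to where further investigation may be worthwhile.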

Donabedian Method

The evaluation of health services is usually based on the collection of data about the structure, inputs, process, outputs and outcomes. This method is credited to Donabedian. 

  • Structure refers to the inputs and resources into the service(s), e.g. buildings, funding, equipment, staff and hospital beds. It is the organisational framework for the activities that happen within the health service(s). Process, on the other hand, is the activities themselves, such as referral patterns, admission procedures and prescribing of drugs. 
  • Process is important to staff as they are directly involved in it. (Indeed, sometimes staff get so caught up in the process that it is difficult for them to see whether the outcomes are being met.) Additionally, data collected about process (often by an audit) can tell us how a service is organised, delivered and used. This can show how accessible the service is and whether scarce resources are being used efficiently, so the data can be used as performance indicators.
  • Outputs are the productivity side of things, such as length of hospital stay, waiting times, discharge rates, patient-professional contact, access, effectiveness and equity. 
  • Outcomes refer to the effectiveness of the activity/activities, measured by mortality and morbidity rates, complication rates, disability, quality of life and patient satisfaction - in other words, the impact upon patients and communities. A good classification of outcomes is Lohr's (1988) 5 D's of outcome: death, disease, disability, dissatisfaction and discomfort. Often, quantitative (e.g. survey or document analysis) and qualitative data are collected about process and structure in order to investigate how the outcome was caused by the activity.  

Measuring Quality of Services

When evaluating services, you will need to look at the quality of service provision. There are a number of different criteria you can use that set out which areas you should collect information on. The following are three of the most common ones that I have come across.

1. Maxwell (1984) Criteria

    • Social acceptability (how is it viewed by users?)
    • Effectiveness (does it work?)
    • Efficiency (can it produce same outcome with fewer inputs?)
    • Relevance to Needs (appropriateness)
    • Equity (fairness)
    • Accessibility (can services be reached by the population?)
    • Locality

2. NHS Performance Assessment Framework

    • Fair Access (measured by equity)
    • Effectiveness (measured by evidence based medicine)
    • Efficiency (measured by finance)
    • Patient/carer Experience (measured by complaints/survey)
    • Health Improvement (measured by public health)
    • Health Outcome (measured by audit)

3. Quality Management System (QMS)

This is used quite a lot in the retail and manufacturing industries and is now being used in relation to health care services across Europe. Sometimes termed QMS-H, QMS focuses on using patient/user feedback and experience as a way of improving services, and seeks to improve the effectiveness of treatments and increase patient satisfaction with the health service in question. QMS is broadly defined as 'all the procedures explicitly designed to monitor, assess and improve the quality of care'. Evidence that can be assessed includes peer review, patient satisfaction surveys, complaints handling, audits, policies and procedures, accreditation against explicit standards, European Quality Awards based on the model of the European Foundation for Quality Management (EFQM) and certification using ISO standards (the ISO 9000 series).

Steps in Evaluating an Intervention or Service

  • Background and state epidemiology of the problem
  • State objectives
  • Select dimensions to be evaluated - look at Donabedian, quality of services
  • Define population to be studied, interventions to be looked at and outcomes to be measured
  • Select study design (experimental/RCT/case control/cohort/ecological/descriptive)
  • Select outcome measures to use: 
    1. Disease measures (objective) - death, morbidity; 
    2. Patient measures (subjective) - functional status, health-related quality of life (SF-36), disease specific (AIMS for arthritis), site specific (e.g. post hip replacement); 
    3. Economic measures - costs (direct: staff, drugs, buildings; indirect: loss of earnings, travel to hospital; intangible: costs of pain and grief);
    4. Acceptability measures - focus groups, interviews, satisfaction surveys and equity measures
  • Decide when and how to collect the data
  • Get evaluating!
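The economic measures in the steps above can be sketched as a simple cost tally. This is a hedged illustration only: all figures and item names are invented, and intangible costs (pain and grief) are shown as unpriced because they resist direct monetary valuation.

```python
# Invented figures illustrating the direct/indirect/intangible cost split.
costs = {
    "direct":     {"staff": 120_000, "drugs": 45_000, "buildings": 30_000},
    "indirect":   {"loss of earnings": 18_000, "travel to hospital": 7_500},
    "intangible": {"pain and grief": None},  # real, but hard to price in money
}

for category, items in costs.items():
    priced = [v for v in items.values() if v is not None]
    total = sum(priced)
    print(f"{category}: £{total:,} across {len(items)} item(s)")
```

In a real evaluation the categories would be populated from the costing exercise for the specific service, and intangible costs are often handled qualitatively or via valuation techniques rather than left unpriced.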

Evaluating Partnerships

In public health, we often rely on partnership working in order to achieve public health outcomes, such as a reduction in childhood obesity or improved oral health, as these outcomes are rarely the sole responsibility of one organisation. The evaluation concepts of efficiency, effectiveness and impact are the same for evaluating partnerships as for evaluating services or interventions, but we also look at relevance and sustainability.

When reviewing/evaluating partnership work, you will need to look at membership and roles as well as the process and functioning of the partnership group. 

  1. Assess the composition and reach of the partnership - look at expected roles versus actual roles, the purpose of each partner being there, skills and expertise, etc.
  2. Does the partnership have clearly defined objectives, goals and governance structures? (i.e. who do they report to?)
  3. What initiatives have the partnership undertaken?
  4. What were the outcomes? 
  5. Did the partnership achieve the objectives they set for themselves?
  6. Does the partnership make good use of partners' time and achieve better results than if the partnership didn't exist? 

There are a number of partnership assessment tools available on the web that you can use, but it is always good to use a mixed-methods approach - quantitative with qualitative (such as interviews with partners) - to get deeper insights. Examples of tools include:

https://www.cdc.gov/dhdsp/docs/partnership_guide.pdf

http://wilderresearch.org/tools/cfi/index.php 
This is a free tool that you can use to assess how well your partnership is collaborating.

Collaborative Partnerships Evaluation Tool  

Further reading

Bowling, A. Research Methods in Health, 2nd Edition. Open University Press, 2002.