National review of community risk methodology across UK Fire and Rescue Service


Good Practice

We have used the practice described in these FRS submissions to inform our suggestions of leading practice. We use the term leading practice here because it aligns with the iterative nature of the evaluation process and the sector development needed in this area. Of the areas we explored in the survey, the evaluation evidence showed the greatest variance.

Government evaluation guidance

Leading practice draws on wider work (for example, government guidance documents such as The Magenta Book or the New Economy Model) and applies it to the context of the UK FRS. What we would highlight clearly here is that, at the time of writing, this family of documents is skewed towards the economic costs and benefits surrounding identified risks. This carries two notes of caution. Firstly, the economic cost of fires (in particular) needs updating, as we have argued elsewhere in this report. Secondly, this focus excludes other costs and benefits which should be included, such as social value or social impact. The UK FRS, working within its role in the Emergency Services Collaboration agenda, should start to commission academic work to develop methodologies which not only take account of the different nature of costs and benefits for identified risks, but are also sophisticated enough to disaggregate shared service costs and savings. However, the rest of this report will refer only to FRS evaluation when we discuss evaluation, in order to give the UK FRS clear guidance on its own current and leading practice.

Within the specific context of prevention and protection in the UK FRS, this may involve all three of the following aspects:

  1. assessing the extent to which the aims of a project/initiative/event (prevention work) have been achieved
  2. ensuring that the focus of prevention and protection work aligns with the aims and objectives of the FRS’s RMP
  3. ensuring any findings produced from evaluation contribute to the future development of the RMP

Principles of evaluation

Leading practice (such as the West Midlands Logic Model) uses the following principles of evaluation:

Evaluation as a holistic approach

It is important to emphasise that evaluation is a process rather than a piece of work which takes place at the end of prevention and protection work. The nature of the prevention and protection work will dictate the aim of the initiative. The aim of the initiative will subsequently determine the appropriate and most practical method of data collection to assess whether the aim has been achieved.

Evaluation specificity and appropriateness

For any prevention and protection work, the evaluation will require an element of pre-evaluation to determine the most appropriate targeting and tailoring methods for the prevention and protection work to use. This is to increase the likelihood of the work addressing the risk. The post-evaluation stage involves assessing whether the aim has been achieved; however, the less detailed and accurate the pre-evaluation stage has been, the less likely it is that the aim will have been achieved. In this way, specificity at pre-evaluation increases specificity at post-evaluation.

Scope of evaluation work

Leading practice also evaluates all aspects of FRS work (protection, prevention, wider community safety initiatives, shared service work, and partnership work). Consequently, two broad forms of evaluation are both included in leading practice:

  • The evaluation of prevention and protection work to determine the extent to which the aims of that work have been achieved; for example, the reduction of ADF within a certain geographical area. Whether or not the aim has been achieved, different decisions need to be taken about replicating that initiative in other areas if the same issue needs to be addressed. If the aim has not been achieved, then the evaluation should provide an indication of the changes needed in the prevention and protection work, or of which alternative initiative/approach should be used instead (accepting that this too will be evaluated). This accepts the basic assumption that prevention and protection work which works on one occasion may not work on another (for example, the aim may be achieved in one particular area or community but not within another). The evaluation should be designed to account for variances in external factors (such as location, the time of year, the day of the week etc.) that are likely to influence achievement of the aim. By evaluating the success of different prevention and protection work, the evidence base builds to guide future direction and inform future RMP processes. Evaluation should not be limited to setting an aim and assessing whether it has been achieved. It should provide the opportunity for the FRS to be reflective about the strengths and weaknesses of its prevention and protection work in specific contexts, in a way that informs recommendations for the future.
  • The evaluation of all community safety prevention and protection work within a particular community/group/geography to determine whether its focus aligns with the overall community safety aims for that particular community/group/location. This is likely to be the main source of organisational development, as it will inform horizon scanning of local risks, particularly when considering the changing needs of a community over a three to five year period. This aspect of evaluation is less concerned with whether or not initiatives achieve their aims, and more concerned with whether addressing the risks targeted by the prevention and protection work will enable the area to achieve its overall community safety aims. We saw from FRS submissions across the UK that the overall aims within a particular risk or geography will be partly driven by performance indicators. Performance indicators and performance boards should not be the main driver of evaluation work, but the two should integrate with and complement one another.

In-house evaluation teams

Submissions also provided a rich source of good and leading practice. Of note in the bands of good practice were FRSs (Tyne and Wear, West Midlands and Fire Scotland) which all have in-house teams developing toolkits or methodologies for evaluating all prevention and protection work, or an evaluation team which also commissions evaluation work from other bodies (Kent). This carries a resource implication that other FRSs might not be able to accommodate.