Participation - finding out what difference it makes
Big question 9: What happens next?
How is the information from the evaluation collected and made sense of? Can we give some 'for instances' of the changes it has led to? How will we get feedback? Who owns the findings, and what will happen as a result of them?
It is important to consider who is responsible for the evaluation, who decides what will happen to the findings, and how people are going to hear about them. It should be clear who will make sure that recommendations are acted on, so that the evaluation makes a difference. The results of the evaluation should be shared with the people who have been taking part. With their permission, the findings can also be shared with other networks of service users and carers, so they can learn from the experience too. However, people's privacy needs to be respected. Participation should not end with the evaluation, but can continue through the way the findings are shared and acted on.
Findings box 9
- Findings need to be produced participatively (Toolkit 7) and presented creatively, in ways that are relevant, interesting and visible to the audiences for whom they are intended (Site 5) (Site 6).
- Feedback to people who have taken part is crucial to prevent cynicism (R02) and to maintain interest (Toolkit 2). People should be involved in deciding how feedback will be provided (R06).
- Who has responsibility for making sure that the findings from the evaluation are implemented? Who will make sure that you continue to find out what difference the participation of service users and carers is making?
- How might wider publicity be given to the findings, for example through websites (Site 4), so that others can use your experiences of evaluation (R09)?
- What are the implications of the evaluation? Are there broader, political implications?
- Research suggests that a properly resourced national user-led network could support the networking that is crucial for positive participation (Branfield et al, 2006).
- A third of the initiatives in one review (R04, p3) were not providing any feedback.
Ideas box 9
How might evaluation needs and methods differ depending on where you are in the cycle?
(Adapted from Toolkit 4)