Commissioning and monitoring of Independent Mental Capacity Advocate (IMCA) services

Assessing the quality of the IMCA service

Commissioners will want evidence of the quality of IMCA services, both when a provider is commissioned and while it is providing the local service.

Particular challenges in assessing the quality of IMCA services include:

Indicators of quality can be grouped into two areas:

1. Quality of the service provider:

2. Quality of the IMCA service provided:

More information about these is provided below.

External quality marks

Working towards or achieving an external quality mark is also widely considered a reliable indicator of the quality of an IMCA service. Potential quality marks include the Community Legal Services Quality Mark, Investors in People, PQASSO and ISO 9000.

A specific advocacy quality mark was launched by A4A in 2008. It was developed in response to the need for advocacy schemes to demonstrate their quality and the limited 'fit' of existing quality marks. Information about the Advocacy Quality Performance Mark can be found on A4A’s website. The quality mark now includes an option for an IMCA-specific review. The quality indicators used are freely available, which allows commissioners to use them directly to support the assessment of the quality of local advocacy schemes.

Feedback from people making instructions

Probably the most commonly used quality indicator is feedback from those people who instructed the IMCA service. This is widely considered to be a valid indicator of quality. Usually this takes the form of a questionnaire sent to these people at the end of an IMCA's work with an individual. Low return rates for these questionnaires have been a difficulty in many areas. In a small number of local authorities, commissioners have directly contacted a proportion of these people for feedback about the IMCA service. Typical questions asked include the following:

There is a risk of a conflict of interest between the decision-maker's view of the IMCA service and the outcome for the service user. For example, while a decision-maker may rate an IMCA service positively because it did not formally challenge their decision, this does not necessarily mean that the IMCA service has been of a high quality.

Further suggestions are given in Appendix 4.

Using IMCA reports to identify quality

IMCA reports are a useful focus for identifying the quality of IMCA services. Good IMCA reports will, for example, demonstrate the following concerning the IMCA's work:

A focus on IMCA reports is consistent with the common use of such reports to demonstrate that IMCAs meet the learning outcomes required for the IMCA units of the national advocacy qualification.

Commissioners may want to consider ways of sampling and quality-checking IMCA reports which do not compromise the confidentiality of individuals. There is a potential role for steering group members here. Action for Advocacy’s Report Writing: Best Practice Guidance is a useful tool.

Sampling could cover the following instructions:

Further suggestions are given in Appendix 5.

Using service user views as a focus for quality

As identified above, there are particular challenges to collecting feedback from service users on the quality of the IMCA service they received. It can be more useful to focus on any views they have about the outcome of the process in which the IMCA was instructed to support and represent them.

The service user’s views (including those expressed at an earlier time) can be categorised in the following way in relation to the outcome of the process:

Nationally, only in about 8 per cent of cases does the outcome go against the service user’s expressed wishes. The outcome in these situations may of course be in the person's best interests.

Focusing on these specific cases is a potential way to include the views of service users in the monitoring of IMCA services. This is because it could be assumed that, were these people to have capacity, they would be dissatisfied with the IMCA service's representation of their views.

Commissioners may want to consider asking IMCA providers to identify instructions where the outcome went against the views of the individual and to provide information about the following:

In some cases it may be possible to gain further feedback directly from the individual.

Another potential quality focus related to client views is IMCA instructions for accommodation decisions where the person was living in their own home. The national figures show that only a very small proportion of people in this situation had the opportunity to continue living in their own home, with the vast majority moving to a residential care home. Even if the person was unable to express a view, it may be that they would have wanted all possible support to remain in their own home. The IMCAs' work in these cases could be examined to see whether alternatives to moving to a care home were adequately explored. This would include making full use of the opportunities of personalisation.

Further suggestions are given in Appendix 6.

Measuring the difference the IMCA service has made

There is some concern that monitoring information focuses too much on the process of IMCA involvement rather than on its outcomes. One development is to look at the difference IMCAs have made to:

Looking at the difference the IMCA service makes challenges IMCAs to consider how they can best use their time to support improvements for individuals. One way this can be done is by providing a summary of a person’s needs and wishes to the care home into which someone has moved. This could help the person receive a personalised service in the care home (see ADASS/SCIE IMCA guidance for accommodation decisions and care reviews). Looking at the difference may also mean IMCAs limiting their involvement in those cases where they may have little or no impact.

The Norah Fry Research Centre is currently undertaking research into the difference IMCAs make in this area. This may help provide benchmarks for the future monitoring of IMCA services and a reliable way to compare quality between different providers.

Appendix 7 sets out the areas examined by the Norah Fry Research Centre. They are included as headings for monitoring data in the example service specification (Appendix 1).