
Developing a proxy version of the Adult social care outcome toolkit (ASCOT)

Abstract

Background

Social care-related quality of life is a key outcome indicator used in the evaluation of social care interventions and policy. It is not, however, always possible to collect quality of life data by self-report even with adaptations for people with cognitive or communication impairments.

A new proxy-report version of the Adult Social Care Outcomes Toolkit (ASCOT) measure of social care-related quality of life was developed to address the issues of wider inclusion of people with cognitive or communication difficulties who may otherwise be systematically excluded. The development of the proxy-report ASCOT questionnaire was informed by literature review and earlier work that identified the key issues and challenges associated with proxy-reported outcomes.

Methods

To evaluate the acceptability and content validity of the ASCOT-Proxy, qualitative cognitive interviews were conducted with unpaid carers or care workers of people with cognitive or communication impairments. The proxy respondents were invited to ‘think aloud’ while completing the questionnaire. Follow-up probes were asked to elicit further detail of the respondent’s comprehension of the format, layout and content of each item and also how they weighed up the options to formulate a response.

Results

A total of 25 unpaid carers and care workers participated in three iterative rounds of cognitive interviews. The findings indicate that the items were well-understood and the concepts were consistent with the item definitions for the standard self-completion version of ASCOT with minor modifications to the draft ASCOT-Proxy. The ASCOT-Proxy allows respondents to rate the proxy-proxy and proxy-patient perspectives, which improved the acceptability of proxy report.

Conclusions

A new proxy-report version of ASCOT was developed with evidence of its qualitative content validity and acceptability. The ASCOT-Proxy is ready for empirical testing of its suitability for data collection as a self-completion and/or interview questionnaire, and also evaluation of its psychometric properties.

Background

In the UK, social care refers to long-term care services that aim to maintain the quality of life of adults with long-term health conditions or ageing-related support needs (for example, home care, day care or residential care). The increased demand for social care due to ageing populations in Europe combined, in some countries, with the trend to reduce public spending has contributed to an interest in the evaluation of the effectiveness and cost-effectiveness of social care based on the outcomes of people who use services [1]. The shift towards outcomes-based social care policy and administration in the UK has also been shaped by narratives of personalisation, increased choice and control for service users, and wider accountability and transparency in the use of public funds [2, 3].

To evaluate the effectiveness of social care based on personal outcomes rather than outputs, it is important to define the objectives of social care and to develop an appropriate instrument that reflects these objectives. Social care may be broadly conceptualised as services that compensate for the effect of impairments on overall wellbeing or quality of life [4,5,6,7]. Therefore, the evaluation of social care support requires the consideration of a broad range of quality of life attributes beyond health that are sensitive to the person-centred, compensatory activity of social care. It has been recognised that condition-specific instruments may not be suitable to assess the broader impact of social care support that is accessed by adults with a wide range of needs and also that generic health-related quality of life measures may not be sensitive enough to capture the outcomes of social care [8].

The construct of social care-related quality of life (SCRQoL) has been proposed as the basis for developing instruments to measure social care outcomes [7]. SCRQoL reflects aspects of quality of life that are important to people who use social care services and may also be conceptualised as the target of the compensatory activity of social care support [9, 10]. The Adult Social Care Outcomes Toolkit (ASCOT) instrument is a preference-weighted measure based on the construct of social care-related quality of life [10,11,12]. The psychometric properties of the ASCOT self-completion questionnaire (ASCOT-SCT4) have been established in samples of English, Dutch and Australian older adults [12,13,14]. The instrument has been recognised as a suitable outcome measure for the evaluation of adult social care services [13, 15] and has been used in local and national social care data collections and evaluation studies in England to inform policy strategy, commissioning and care practice [2, 16,17,18].

If individual quality of life is used to evaluate the effectiveness of social care interventions and policy in this way, a key challenge is how to collect quality of life data from people with cognitive or communication difficulties who are unable to answer on their own behalf even with support, alternative formats or communication aids, so as to avoid the issues of sampling bias and systematic exclusion from ‘having a voice’ [19,20,21]. In the evaluation of health care interventions using patient-reported outcome or experience measures, a widely-used method is the collection of data from someone who answers on behalf of the individual whose quality of life is to be assessed (by ‘proxy’). Despite its widespread use, it has been argued that data collection by proxy-report should only be used as a last resort when other methods are not possible, because it takes away the individual’s opportunity to express their views [22, 23]. While the standard in quality of life measurement is self-report whenever possible, it is recognised that proxy-report is preferable to the systematic exclusion of individuals who are unable to self-report, based primarily on the principles of equity and inclusion, as well as the potential methodological issues associated with missing data and bias [19, 20].

The aim of the study was to develop a new social care-related quality of life measure based on the ASCOT-SCT4 that could be used in circumstances where the individual is unable to self-report and other methods of eliciting outcomes information are not feasible. This article outlines the second phase of work that sought to apply the findings identified in the first phase of the project [24] to develop the content and format of the new instrument and to evaluate its content validity and acceptability.

Methods

This qualitative study aimed to develop a proxy-report version of the self-completion ASCOT-SCT4 with adequate content validity and acceptability to proxy respondents [10]. Content validity is a measurement property that assesses whether questionnaire items reflect the perspective of the population of interest (in this case, proxy respondents for adults who use social care services) and also whether the questionnaire format, wording and instructions are relevant, understandable and acceptable [25]. The study involved an initial phase to develop items and the questionnaire layout based on an earlier phase of research [21, 24] followed by cognitive interviews to refine the questionnaire and establish its content validity [25,26,27,28].

Questionnaire development

While recognising that proxy-report is not equivalent to self-report, the questionnaire development sought to draw upon the same principles and concepts as the ASCOT-SCT4 [10, 12]. The eight attributes captured by the ASCOT-SCT4 are: Control over daily life; Occupation (‘doing things I value and enjoy’); Social participation and involvement; Personal safety; Personal cleanliness and comfort; Food and drink; Accommodation cleanliness and comfort; and Dignity. The instrument has four response options for each item to distinguish between the ideal state, no needs, some needs and high-level needs in relation to each quality of life attribute [10] (see Table 1).

Table 1 ASCOT response levels (adapted from Netten et al. [10])
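
For readers who handle ASCOT data programmatically, the structure described above (eight attributes, each rated on one of four ordered response levels) can be captured directly in code. The following is a minimal illustrative sketch in Python; the enum names and the 0–3 coding are assumptions made for this example and are not the published ASCOT scoring or preference weights.

```python
from enum import IntEnum

# Illustrative (not official) coding of the four ASCOT response levels
# described in Table 1: higher values indicate better states.
class ResponseLevel(IntEnum):
    HIGH_LEVEL_NEEDS = 0
    SOME_NEEDS = 1
    NO_NEEDS = 2
    IDEAL_STATE = 3

# The eight quality of life attributes captured by the ASCOT-SCT4.
ATTRIBUTES = [
    "Control over daily life",
    "Occupation",
    "Social participation and involvement",
    "Personal safety",
    "Personal cleanliness and comfort",
    "Food and drink",
    "Accommodation cleanliness and comfort",
    "Dignity",
]
```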

The key challenges of developing a proxy-report instrument were explored in an earlier phase of research, which involved a rapid literature review, focus groups and in-depth one-to-one interviews [21, 24]. In the English Adult Social Care Survey, the majority of proxy respondents are family or friends who live with the respondent or elsewhere and proxy-report by care staff is relatively infrequent (5.9% of proxy respondents in the 2014/15 ASCS) [29]. It should be recognised that some people may have no or limited contact with family or friends, so care staff may be called upon to act as proxy respondents, especially if they have frequent contact with individuals and in-depth knowledge of the person’s needs and preferences through their practice of care (for example, with support staff who work intensively one-to-one or with small groups of adults with learning disabilities). While some studies have found that proxy-reported quality of life varies by ‘type’ of proxy, for example, health or social care professionals compared to family [20, 30,31,32,33], other studies have not found significant differences by type of proxy [34,35,36]. There is also evidence that this may be accounted for by differences in proximity, intimacy and frequency of contact between the proxy and the individual rather than the ‘type’ of proxy per se [37, 38]. Therefore, in this study and also the earlier phase of research, we considered both care support staff and also family or friends as potential proxy respondents.

The key challenges in proxy response identified in this earlier phase of research were: (1) care workers’ resistance to the idea of acting as a proxy respondent; (2) the perceived difficulty of judging care recipients’ internal subjective state (i.e. how they ‘think’ and ‘feel’); and (3) proxy respondents’ wish to express that their own response (as a proxy) differed from how they thought the care recipient would respond, based on the proxy respondents’ judgement of the care recipients’ perspective [24]. Proxy respondents also commented that they found it difficult to differentiate between the top response options of the ‘ideal state’ and ‘no needs’ [24], whereas earlier work on self-report had found that the inclusion of the ‘ideal state’ supported comprehension and judgement of response [10] (see Table 1). Since the ‘ideal state’ represents a situation in which an individual’s needs and preferences are met to his/her desired level [10], it was hypothesised that this was a proxy-specific issue related to the proxy perspective adopted in formulating a response.

In response to the issues summarised above, the authors, with advice from the ASCOT development team at the University of Kent (www.pssru.ac.uk/ascot), explored four significant variations in the format and content of the draft proxy version of the questionnaire. First, the questionnaire was formatted to allow the collection of proxy-ratings of quality of life for each attribute from the two proxy perspectives identified by Pickard and Knight [39]. The proxy-proxy perspective represents the proxy’s view based on their own preferences and values, while the proxy-patient perspective asks the proxy to answer by reconstructing, as best they can, the individual’s internal mental state based on their knowledge of the individual and his/her preferences. It was hypothesised that allowing proxy respondents to express both proxy perspectives may improve the acceptability of a proxy-report instrument, especially for paid carers. Furthermore, the specification of the proxy perspective within the questionnaire was anticipated to reduce the potential for bias that may arise if different types of proxy respondents (i.e. paid or unpaid carers), without clear instructions as to which proxy perspective to use, systematically refer to different proxy perspectives in order to formulate their response to the questionnaire.

The second modification was to include a comments box after each question. In the first phase of work, care staff reported that they would feel more comfortable with acting as a proxy if they had the opportunity to add further detail or explanation of their responses, especially if quality of life had been rated as some or high-level needs [24]. The inclusion of the comments boxes aimed to improve the acceptability of the questionnaire, especially for care workers.
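
Taken together, the first two modifications mean that each ASCOT-Proxy item yields two ratings (proxy-proxy and proxy-patient) plus an optional free-text comment. A minimal sketch of how such an item response might be stored is given below; the class and field names are hypothetical, introduced only to illustrate the two-column-plus-comment structure, and the 0–3 level coding follows the assumption used in the previous sketch.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProxyItemResponse:
    """One ASCOT-Proxy item rated from both proxy perspectives.

    Hypothetical record layout for illustration only: levels are coded
    0 = high-level needs, 1 = some needs, 2 = no needs, 3 = ideal state.
    """
    attribute: str                 # e.g. "Food and drink"
    proxy_proxy: int               # the proxy's own view (0-3)
    proxy_patient: int             # how the proxy thinks the person would answer (0-3)
    comment: Optional[str] = None  # optional free-text explanation of the rating

# Invented example: a care worker's ratings for one item, with a comment.
example = ProxyItemResponse(
    attribute="Food and drink",
    proxy_proxy=2,      # proxy-proxy perspective: no needs
    proxy_patient=3,    # proxy-patient perspective: ideal state
    comment="Chooses and helps to prepare meals with support.",
)
```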

Third, during the development of the original ASCOT-SCT4, it was found that some respondents understood the Dignity item to refer to the impact of having help on their self-esteem and self-perception, rather than the effect of the way they are supported or treated by care staff [10]. Based on this evidence, an additional item was developed for the ASCOT-SCT4 to allow respondents to express any difficulties they have with coming to terms with needing help [10]. This item is not considered in scoring the ASCOT-SCT4. In the development of the proxy-report questionnaire, by contrast, it was initially decided to include only one Dignity item because of the additional complexity of asking respondents to adopt two different proxy perspectives.

Finally, based on proxy respondents’ reports that they found it difficult to judge how the individual thinks or feels [24], the wording of some items was modified in the draft questionnaire (version 1.0) to be more ‘objective’: for example, Occupation was revised to refer to what the person ‘wants to do’ rather than ‘values and enjoys’, Social participation referred to any type of social activity rather than specifically to social interaction with ‘people s/he likes’, and the items for Personal safety and Personal comfort and cleanliness were modified to refer to ‘being’ rather than ‘feeling’ safe or clean and comfortable. In the first round of interviews with the version 1.0 questionnaire, we sought to explore whether the use of more ‘objective’ criteria would improve the acceptability of the proxy-report questionnaire. The research team were, however, also aware that these modifications were in tension with the underlying conceptual basis of the ASCOT measure, which captures social care-related quality of life with reference to the individual’s preferences, values and attitudes.

A summary of the process of the initial development and refinement of the questionnaire is shown in Fig. 1.

Fig. 1 Flow diagram of the process of cognitive testing

Recruitment of participants

Participants were recruited purposively via three social care provider organisations across three local authorities in South East England. The study inclusion criteria were adult care workers or unpaid family carers (aged 18 years or over) in regular contact with an adult who uses social care services and who would be unable to answer the ASCOT-SCT4 on his/her own behalf due to cognitive and/or communication impairments. Potential participants were sent an invitation letter and information sheet by the care provider, which explained the purpose of the research. Those who were interested in participating were invited to contact the research team to arrange an interview and to answer any questions about the purpose or nature of the research. A total of 25 proxy respondents (13 care workers, 12 family carers) were recruited and completed a cognitive interview.

The study was approved by the national social care research ethics committee in England (reference: 13/IEC08/0020). Written informed consent was obtained from all participants prior to interview.

Cognitive interviews

All interviews were conducted between June 2015 and May 2016 by three trained qualitative interviewers (SR, GC, JC), two of whom had prior experience of cognitive interviews in the development of self-report questionnaires (SR, JC). The interviews lasted between 40 and 75 min and took place at a location convenient for the respondent – usually, in a private office or meeting room in the workplace or at the respondent’s home. The proxy respondents were asked to complete the ASCOT questionnaire using a think-aloud method of cognitive interviewing with concurrent follow-up probes [26]. Think-aloud requires the respondent to speak aloud their thoughts as they read the instructions and complete each item of the questionnaire. Following this, the interviewers used verbal probing to further explore the comprehension, clarity and relevance of each item and its response options (for example, ‘why did you choose this response rather than the level above/below?’) [26] (see Table 2 for further examples). The probes were used flexibly by the interviewers to explore the respondents’ understanding and ability to respond to the questions, as well as to identify any potential issues with the acceptability of item wording, format or layout. For some items, alternative versions were presented to assess the proxy respondents’ preferences and to evaluate whether the different content or format would affect the respondent’s comprehension, judgement or response. After the cognitive interview, the respondents were asked to reflect on their overall experience of completing the questionnaire with a focus on clarity and the acceptability of answering the questionnaire as a proxy respondent (see Table 3).

Table 2 Examples of cognitive interview probes
Table 3 Interview guide

Qualitative analysis

The interviews were audio-recorded and transcribed for analysis. The written interview transcripts were analysed using NVivo version 10 software [40]. While it is possible to informally analyse data from cognitive interviews to identify and resolve issues with item wording, layout or format, the use of systematic thematic analysis may provide a more rigorous way of analysing think-aloud cognitive interviews ([26] p.157). The researchers agreed on an initial analytical coding structure based on the framework of cognitive interviews as a technique to explore the understanding and acceptability of questionnaire item wording, format and layout (see Table 4) ([26] pp.164–7). The data was coded by two researchers (SR, GC). The coding structure was iteratively refined based on themes and subthemes that emerged from the data, which allowed the identification of key issues and insights.

Table 4 Coding tree

Iterative questionnaire refinement

The questionnaire was refined through an iterative process of cognitive interviewing ([26] p.146). Each round of interviews had a minimum of six and a maximum of twelve participants, which is within the guidelines suggested by Willis [26] (p.138). The researchers met during the fieldwork to review the data, to discuss any emerging themes, and to agree on any modifications to the questionnaire.

After six interviews with questionnaire version 1.0, the data were analysed to identify any issues with comprehension, response or acceptability that would justify modification of the questionnaire at that point. The edits were applied to generate version 2.0 of the questionnaire, which was tested in the next round of interviews (n = 7). The second interim analysis considered whether any further changes were required based on the interview data. The questionnaire with these changes is referred to as version 3.0, which was then tested in the final set of interviews (n = 12). During this final set of interviews, the research team agreed that saturation had been reached (i.e. no new concepts or issues emerged). At this stage, the third and final analysis was conducted on this final set of interviews, both separately and alongside the data from the two earlier rounds.

A summary of the iterative development process is shown in Fig. 1.

Results

Sample characteristics

The demographic characteristics of the sample are shown in Table 5. Of the 25 respondents, 60% were female. The sample comprised care staff (n = 13) and family carers (n = 12). Half of the family carers responded on behalf of their spouse or partner (n = 6); the other family carers responded on behalf of other family members (their adult children, parents or siblings). The majority of proxy respondents reported that the care recipient’s primary support reason was an intellectual disability (n = 9) or autism (n = 7). The remaining interviews were conducted with proxies who spoke on behalf of someone with dementia or another long-term condition that affected cognitive ability, comprehension or ability to respond.

Table 5 Characteristics of the cognitive interview participants (n = 25)

Cognitive interviews

With the iterative modifications outlined in detail below, the questionnaire items were found to be understandable and acceptable to care staff and family carers asked to act as proxy respondents. A summary of the findings in relation to the understanding of concepts and acceptability of each item to proxy respondents is presented in Table 6.

Table 6 Ease of understanding and acceptability of the ASCOT-Proxy items

Interim analysis and review (1)

Based on analysis of the first round of interviews, the modifications outlined in Table 7 were applied to the draft questionnaire. For Occupation and Social participation, it was found that the wording of the original ASCOT-SCT4 item or response options were preferred to the version 1.0 proxy questionnaire because they were perceived to be more person-centred (“looks at the individual”). One formal carer, for example, compared the two versions of the Occupation item, ‘what s/he values and enjoys’ (original ASCOT-SCT4) to ‘what s/he wants to do’ (version 1.0), as follows:

“I think that’s worded better… because it’s talking about her values and her enjoyment… When we’re looking at the people we support we’re looking at their values and what they like to do, their likes and dislikes - so I just think it’s easier to answer”. [GC_FC_01]

Table 7 Modifications based on respondent feedback (questionnaire version 1.0)

Because of this, the items of Occupation, Social participation, Personal safety and Personal comfort and cleanliness were modified to correspond to the original ASCOT-SCT4 item wording to improve acceptability and comprehension (see Table 7).

Based on the earlier research that explored care staff and carers’ views on the ASCOT-SCT4 questionnaire, the version 1.0 proxy questionnaire sought to avoid the use of ‘enough’ because of potential issues with comprehension or acceptability [24]. The first round of interviews identified that respondents found the alternative wording used in version 1.0 (‘It’s ok’) to be too informal and imprecise:

“I just don’t like how that looks, just remove the ‘it’s ok’… it’s too vague”. [SR_FC_02]

The first round of interviews explored whether the adaptation of the Personal safety and Personal comfort and cleanliness items to be more ‘objective’ (i.e. being rather than feeling safe) would improve acceptability of the measure to proxy respondents [24]. Interestingly, despite the finding of the earlier phase of work that some respondents were uncomfortable with rating the cared-for person’s internal subjective state, the respondents spontaneously answered the questions by attempting to construct the cared-for person’s subjective perspective based on their observations and knowledge of the individual. This was prompted by the questionnaire format of asking respondents to rate the two different proxy perspectives (proxy-proxy and proxy-patient).

“It took me a little bit longer to answer this one I thought, I don’t know whether you thought exactly the same but because obviously I’m trying to think how this individual is thinking, and also I'm thinking about what she does, and is that because she doesn’t feel 100 percent safe, and what does safe mean to her as well”. [GC_FC_01]

Based on this finding, the Personal safety and Personal comfort and cleanliness items were adapted to correspond to the original ASCOT-SCT4, which refers to feeling rather than being safe or clean and presentable. In subsequent rounds of cognitive interviews, the respondents were able to understand and respond to these two modified items with no evidence of any issues with acceptability.

Interim analysis and review (2)

Based on advice from the ASCOT development team, the researchers asked respondents in the second round of cognitive interviews to compare two versions of the questionnaire (see Table 8). Cognitive interviews with a carer version of the ASCOT, the ASCOT-Carer, had identified that some respondents were not able to understand the word ‘adequate’ used in some response options [41]. The aim of comparing the two versions in this study, which used either ‘adequate’ or ‘enough’ in the response options for six of the eight items (see Table 8), was to further test the acceptability and comprehension of ‘adequate’ and also to determine whether the alternative item wording (‘enough’) would affect comprehension, judgement or response in the context of a proxy-report questionnaire. For each of the six items, the respondents were asked to respond to the version using ‘adequate’ before being asked to review the version using ‘enough’. The interviewer probed to explore whether the respondent preferred one version to the other, and also whether the different wording affected the respondent’s comprehension, judgement or response.

Table 8 Questionnaire version 2.0 modifications to response options

There was no clear preference between the two versions. Two respondents showed a consistent preference for ‘adequate’ [GC_FC_05; SR_FC_03], four respondents preferred ‘enough’ [GC_FC_04; GC_FC_06; SR_IC_01; SR_IC_02], and one respondent indicated a preference for ‘adequate’ or ‘enough’ that varied by item [SR_IC_03]. The findings indicate that ‘adequate’ was understood by some respondents to be an objective baseline related to social services’ or other external agencies’ standards, while ‘enough’ implied a subjective standard.

“If you’re social services… they’ve got very different opinions of what’s adequate to bring a child up in a house. Most of us would say definitely not, but because there’s levels, there’s only concerns on certain levels, isn’t there? So adequate makes me feel that that’s what it’s looking at, whether it’s adequate. Whereas this is more about--, if you’re asking my opinion then it’s whether it’s enough for me”. [SR_IC_01]

While the intention of the ASCOT tools is to ask respondents to rate quality of life from a subjective perspective rather than an objective standard, it was decided to use the term ‘adequate’ rather than ‘enough’ to maintain comparability between the ASCOT-SCT4 and the proxy-report version.

Proxy perspectives

Pickard and Knight’s (2005) conceptual framework of proxy response was used to develop proxy questionnaire items that asked respondents to rate: (1) their own view (proxy-proxy perspective); and (2) what they thought the care recipient would say (proxy-patient perspective). This dual response approach was found to be acceptable to care worker and family carer proxies. The format allowed the respondents to express differences between their opinion and their understanding of the care recipient’s perspective. These differences were explained by reference to different personal preferences influenced, for example, by social or cultural norms or health condition-related factors:

“Her generation don’t do all this changing sheets every day malarkey and they don’t believe in so much washing. That’s just alien to them. And she used to say, “That’s fine, it doesn’t need a wash”. But to me it would be like reeking and I’d be like, “No, you must”. [SR_IC_10]

While the results suggest that the two proxy perspectives improved the acceptability of the questionnaire, some respondents were confused by the questionnaire layout with two columns that correspond to the two proxy perspectives (see Fig. 2). In the first two rounds of interviews, four of the thirteen proxy respondents answered the first question (Food and drink) by ticking multiple boxes in each column or writing yes/no to indicate whether each statement applied or not. They were, however, able to correctly complete the remainder of the questionnaire once they had realised that the format required one tick per column. Based on these findings, the questionnaire for version 3.0 was re-formatted to include ‘please tick one box’ above the response boxes (Fig. 2).

Fig. 2 Questionnaire format

In the third round of interviews (n = 12), two family carers were initially confused by the layout. The interviewers had been briefed to allow the respondent to continue the questionnaire without prompting them to notice the instruction to tick one box per column. In one of the two cases [SR_IC_05], the respondent spontaneously realised that their initial response with multiple ticks had been incorrect and amended their response; there were no issues with the remaining questions. In the other case [SR_IC_07], the respondent directly asked the interviewer whether they needed to tick multiple boxes in each column. Once they had seen the instruction, with a prompt from the interviewer, the respondent understood that only one tick per column was required to indicate their rating:

“Tick, ‘Please tick one box for each,’ oh, ‘Tick one box for each column.’ Oh I see, so [sigh], so it’s just the top one then isn’t it?” [SR_IC_07]

Dignity

The ASCOT Dignity item aims to capture the positive and negative psychological effects of formal support and care on the service user’s personal sense of significance [10]. In this study, it was found that the Dignity item was generally understood to relate to the effect of care workers' working practices and interpersonal interactions on the cared-for person’s sense-of-self. The interviews often focussed on interpersonal interactions associated with personal care:

“I would say dignity is like--, if someone is having a bath and you’re bathing them, like for instance they get out of the bath you turn away or you give them towels so they can hide themselves so they’ve, you know, privacy”. [SR_FC_02]

Other respondents, however, reflected on the wider implications of care practice and the quality of interpersonal interactions with care staff on the individual’s self-perception and dignity:

“I would say the quality of care is what results in dignity and respect… If the person who is going in to support them ultimately has that person’s--, as top most priority their needs, they will always think better of themselves at the end of that”. [JC_FC_03]

In one interview with a family carer, although the carer initially answered the question in relation to paid care, the boundary between the support given by family and paid carers became blurred:

“I would probably tick the top one and say he thinks and feels better about himself because we always do, we always praise him up. We always tell him he looks good”. [SR_IC_03]

To address this issue, the question format was modified so that the reference to paid care in the question is emphasised in bold and italics.

As observed in the development of the ASCOT-SCT4 [10], although less frequently, three proxy respondents (one care worker, two family carers) spoke of the effect of needing help on the individual’s sense-of-self.

“Have I ever met anyone who has been quite positive and upbeat? I don’t think I ever have. I think if I did they most probably wouldn’t want our support … it’s the very act of receiving care that may make them feel undermined”. [JC_FC_03]

Importantly, however, the ‘think aloud’ indicated that the three proxy respondents made their final judgement and response based on the way in which the care recipient was treated by care workers, rather than the effect of needing help on the individual’s sense-of-self.

Discussion

The aim of this research was to develop a proxy-report measure of social care-related quality of life and establish its qualitative content validity through cognitive interviews [26]. This qualitative research showed that proxy respondents were generally able to respond to an adapted version of the ASCOT-SCT4 instrument designed to capture social care-related quality of life. The acceptability of the questionnaire was improved by the use of the two proxy perspectives proposed by Pickard and Knight [39] to allow respondents to express their opinion (proxy-proxy perspective) and distinguish this from their view of the care recipient’s own perspective (proxy-patient perspective). The findings of this study indicate that proxy respondents found it acceptable to adopt both proxy-proxy and proxy-patient perspectives and that they were able to understand and respond to the items based on both perspectives.

While an earlier phase of work found that care workers were hesitant to provide proxy judgements of individual care recipients’ quality of life, especially when questions related to subjective concepts such as feeling safe rather than more objective judgements such as being safe [24], this qualitative study indicates a high level of acceptability of judging subjective states using the two proxy perspective conceptual framework [39]. Furthermore, the first round of interviews indicated that, when proxy respondents were given a questionnaire based on dual proxy-patient and proxy-proxy perspective report, the item modifications that made the items less subjective in description (i.e. being rather than feeling safe) were less acceptable than items framed around the individual’s subjective perspective. However, some proxy respondents still had difficulty in projecting themselves into the individual’s internal state when adopting the proxy-patient perspective, especially when the proxy respondent had to rely on ambiguous external cues and behaviours to formulate their response.

While the two proxy perspective methodology improved the acceptability of proxy report of social care-related quality of life in the eight ASCOT-SCT4 domains [24], there remains the question of which source of information to use if the aggregate data is to be used for policy, administration and planning of social care services. The proxy-report framework proposed by Pickard and Knight [39] was developed in the context of health-related quality of life, alongside a differentiation between the intra-rater gap between proxy perspectives and the inter-rater gap between self-rated and proxy-rated quality of life. Indeed, there is an extensive literature that compares self-report and proxy-report, of which only a minority of studies have found no significant difference between self-report and proxy-report [42,43,44] or that proxy-report tends to overestimate quality of life compared to self-report [45]. The majority of studies that directly compare self-report and proxy-report have found an underestimation of quality of life by proxy respondents compared to patient self-report (for example, [20, 34, 38, 46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65]). While it is unclear whether these differences represent a systematic bias in proxy-report that may be extrapolated to cases where proxy-report is the only means of estimating an individual’s quality of life [19], it is evident that proxy-reported quality of life cannot simply be used interchangeably as a substitute for self-reported quality of life.

The two proxy perspectives may be conceptualised as two separate sources of information based on the proxy respondent’s mental construct of the care recipient’s likely internal state informed by their knowledge of the individual (proxy-patient perspective) or the proxy’s judgement based on their internal standards, preferences and attitudes (proxy-proxy perspective) [39]. Based on this view, it has been proposed that in the absence of a clear justification of one over the other, it would be informative to gather ratings based on both perspectives [21, 39]. When collecting data from both proxy perspectives, there remains the judgement as to how to use the two sets of data in analysing, reporting and applying the data in policy-making and service improvement. As such, while the qualitative analysis presented in this paper has shown that proxy respondents are able to rate both proxy perspectives and that this improves the acceptability of the questionnaire, further work is required to establish the psychometric properties of the data collected from both proxy perspectives and to explore how best to handle, analyse and apply the data.

The study has some limitations. First, the first round of cognitive interviews included only care workers who supported adults with intellectual disabilities in a supported living context. In the subsequent two rounds of interviews, we purposively sampled a mixture of care workers and family carers because of evidence that different types of proxy respondent may respond differently to proxy-report questionnaires [66, 67]. We also sought to include carers or care staff who supported adults with other support needs and who lived in different support contexts, to ensure that the questionnaire was feasible and acceptable to a range of proxy respondents. Second, the testing of the final version indicated that not all respondents were immediately able to understand the layout of the questionnaire. While there was no evidence of comprehension issues related to the two proxy perspectives, some respondents assumed that they had to tick all boxes in each column that applied rather than tick one box to indicate which statement best applies. This issue persisted even with a modification to include the instruction to ‘please tick one box for each column’ above the response options (see Fig. 2). Although respondents were eventually able to work out how to record their responses, a pilot survey would be required to explore whether proxy respondents could do so without the presence of an interviewer in the context of a paper-based self-completion questionnaire.

Conclusions

In conclusion, it was found that the item wording, format and layout of the ASCOT-Proxy questionnaire were understandable and acceptable to care workers and family carers invited to act as proxy respondents. Respondents’ comprehension of the items corresponded to the construct definitions captured by the items in the self-report ASCOT-SCT4. This correspondence was improved by modifications that minimised differences in item wording between the ASCOT-Proxy and the ASCOT-SCT4, whilst maintaining the acceptability of the instrument. The next step is to evaluate the feasibility and psychometric properties of the new instrument in the context of self-completion surveys.

Abbreviations

ASCOT: Adult Social Care Outcomes Toolkit

ID: Intellectual disability

LA: Local authority

QoL: Quality of life

SCRQoL: Social care-related quality of life

SCT4: Self-completion, four-level questionnaire

References

1. Waldhausen A. Care services in crisis? Long-term care in times of European economic and financial crisis. Frankfurt: Institute for Social Work and Social Education; 2014.
2. Department of Health. Transparency in outcomes: a framework for adult social care. London: Department of Health; 2011.
3. Bovaird T. Attributing outcomes to social policy interventions: ‘gold standard’ or ‘fool's gold’ in public policy and management? Soc Pol Adm. 2012;48(1):1–23.
4. Nocon A, Qureshi H. Outcomes of community care for users and carers: a social service perspective. London: Open University Press; 1996.
5. Qureshi H, Patmore C, Nichols E, Bamford C. Outcomes in community care practice. Overview: outcomes of social care for older people and carers. York: Social Policy Research Unit, University of York; 1998.
6. Bamford C, Qureshi H, Nicholas E, Vernon A. Outcomes in community care practice: outcomes of social care for disabled people and carers. York: Social Policy Research Unit, University of York; 1999.
7. Netten A, Beadle-Brown J, Caiels J, Forder J, Malley J, Smith N, Towers A, Trukeschitz B, Welch E, Windle K. Adult social care outcomes toolkit (ASCOT): main guidance v2.1. Canterbury: Personal Social Services Research Unit, University of Kent; 2011.
8. Forder J, Caiels J. Measuring the outcomes of long-term care. Soc Sci Med. 2011;73:1766–74.
9. Netten A. Overview of outcome measurement for adults using social care services and support. London: NIHR School for Social Care Research; 2011.
10. Netten AP, Burge P, Malley J, Potoglou D, Towers A, Brazier B, Flynn T, Wall B. Outcomes of social care for adults: developing a preference-weighted measure. Health Technol Assess. 2012;16(16):1–165.
11. Potoglou D, Burge P, Flynn T, Netten A, Malley J, Forder J, Brazier J. Best-worst scaling vs discrete choice experiments: an empirical comparison using social care. Soc Sci Med. 2011;72(10):1717–27.
12. Malley J, Towers A-M, Netten A, Brazier J, Forder J, Flynn T. An assessment of the construct validity of the ASCOT measure of social care-related quality of life with older people. Health Qual Life Outcomes. 2012;10:21.
13. Van Leeuwen K, Bosmans J, Jansen A, Hoogendijk E, van Tulder M, van der Horst H, Ostelo R. Comparing measurement properties of the EQ-5D-3L, ICECAP-O, and ASCOT in frail older adults. Value Health. 2015;18:35–43.
14. Kaambwa B, Gill L, McCaffrey N, Lancsar E, Cameron I, Crotty M, Gray L, Ratcliffe J. An empirical comparison of the OPQoL-brief, EQ-5D-3L and ASCOT in a community dwelling population of older people. Health Qual Life Outcomes. 2015;13:164.
15. Makai P, Brouwer W, Koopmanschap M, Stolk E, Nieboer AP. Quality of life instruments for economic evaluations in health and social care for older people: a systematic review. Soc Sci Med. 2014;102:83–93.
16. Forder J, Jones K, Glendinning C, Caiels J, Welch E, Baxter K, Davidson J, Windle K, Irvine A, King D, Dolan P. Evaluation of the personal health budget pilot programme. Canterbury: PSSRU, University of Kent; 2012.
17. Johnstone L, Page C. Using the Adult Social Care Outcomes Toolkit (ASCOT) in the assessment and review process. Research Policy Planning. 2013/14;30(3):179–92.
18. Department of Health. The Adult Social Care Outcomes Framework 2015/16. London: Department of Health; 2014.
19. von Essen L. Proxy ratings of patient quality of life: factors related to patient-proxy agreement. Acta Oncol. 2004;43(3):229–34.
20. Steel JL, Geller DA, Carr BI. Proxy ratings of health related quality of life in patients with hepatocellular carcinoma. Qual Life Res. 2005;14(4):1025–33.
21. Rand S, Caiels J. Using proxies to assess quality of life: a review of the issues and challenges. Canterbury: QORU, University of Kent; 2015.
22. Schalock RL, Brown I, Brown R, Cummins RA, Felce D, Matikka L, Keith KD, Parmenter T. Conceptualization, measurement, and application of quality of life for persons with intellectual disabilities: report of an international panel of experts. Ment Retard. 2002;40(6):457–70.
23. Verdugo MA, Schalock RL, Keith KD, Stancliffe RJ. Quality of life and its measurement: important principles and guidelines. J Intellect Disabil Res. 2005;49(Pt 10):707–17.
24. Caiels J, Rand S, Crowther T, Forder J, Collins G. Developing a proxy-report measure of social care-related quality of life. Canterbury: QORU, University of Kent; 2016.
25. Brod M, Tesler L, Christensen T. Qualitative research and content validity: developing best practices based on science and experience. Qual Life Res. 2009;18:1263–78.
26. Willis G. Cognitive interviewing: a tool for improving questionnaire design. London: Sage; 2005.
27. Beatty PC, Willis G. The practice of cognitive interviewing. Public Opin Q. 2007;71:287–311.
28. Frost MH, Reeve BB, Liepa AM, Stauffer JW, Hays RD. What is sufficient evidence for the reliability and validity of patient-reported outcome measures? Value Health. 2007;10:S94–S105.
29. NHS Digital. Personal Social Services Adult Social Care Survey, England - 2014-15. http://content.digital.nhs.uk/catalogue/PUB18642. 2015.
30. Gil Z, Abergel A, Spektor S, Khafif A, Fliss DM. Patient, caregiver, and surgeon perceptions of quality of life following anterior skull base surgery. Arch Otolaryngol Head Neck Surg. 2004;130(11):1276–81.
31. Becchi A, Rucci P, Placentino A, Neri G, de Girolamo G. Quality of life in patients with schizophrenia – comparison of self-report and proxy assessments. Soc Psychiatry Psychiatr Epidemiol. 2004;39(5):397–401.
32. Rebollo P, Alvarez-Ude F, Valdes C, Estebanez C. Different evaluations of the health related quality of life in dialysis patients. J Nephrol. 2004;17(6):833–40.
33. Hung SY, Pickard AS, Witt WP, Lambert BL. Pain and depression in caregivers affected their perception of pain in stroke patients. J Clin Epidemiol. 2007;60(9):963–70.
34. Schmidt S, Power M, Green A, Lucas-Carrasco R, Eser E, Dragomirecka E, Fleck M. Self and proxy rating of quality of life in adults with intellectual disabilities: results from the DISQOL study. Res Dev Disabil. 2010;31(5):1015–26.
35. Crespo M, Bernaldo de Quiros M, Gomez MM, Hornillos C. Quality of life of nursing home residents with dementia: a comparison of perspectives of residents, family, and staff. Gerontologist. 2012;52(1):56–65.
36. Gomez-Gallego M, Gomez-Amor J, Gomez-Garcia J. Determinants of quality of life in Alzheimer's disease: perspective of patients, informal caregivers, and professional caregivers. Int Psychogeriatr. 2012;24(11):1805–15.
37. Makai P, Brouwer WB, Koopmanschap MA, Nieboer AP. Capabilities and quality of life in Dutch psycho-geriatric nursing homes: an exploratory study using a proxy version of the ICECAP-O. Qual Life Res. 2012;21(5):801–12.
38. Graeske J, Fischer T, Kuhlmey A, Wolf-Ostermann K. Quality of life in dementia care: differences in quality of life measurements performed by residents with dementia and by nursing staff. Aging Ment Health. 2012;16(7):819–27.
39. Pickard AS, Knight SJ. Proxy evaluation of health-related quality of life: a conceptual framework for understanding multiple proxy perspectives. Med Care. 2005;43(5):493–9.
40. QSR International Ltd. NVivo qualitative data analysis software, Version 10. 2012.
41. Rand S, Malley J, Netten A. Identifying the Impact of Adult Social Care (IIASC): interim technical report. Canterbury: Personal Social Services Research Unit, University of Kent; 2012.
42. Gabbe BJ, Lyons RA, Sutherland AM, Hart MJ, Cameron PA. Level of agreement between patient and proxy responses to the EQ-5D health questionnaire 12 months after injury. J Trauma Acute Care Surg. 2012;72(4):1102–5.
43. Beadle-Brown J, Murphy G, Di TM. Quality of life for the Camberwell cohort. J Appl Res Intellect Disabil. 2009;22(4):380–90.
44. Schiffczyk C, Jonas C, Lahmeyer C, Muller F, Riepe MW. Gender-dependence of substituted judgment on quality of life in patients with dementia. BMC Neurol. 2011;11:118.
45. Muus I, Petzold M, Ringsberg KC. Health-related quality of life after stroke: reliability of proxy responses. Clin Nurs Res. 2009;18(2):103–18.
46. Edelman P, Fulton BR, Kuhn D. Comparison of dementia-specific quality of life measures in adult day centers. Home Health Care Serv Q. 2004;23(1):25–42.
47. Sloane PD, Zimmerman S, Williams CS, Reed PS, Gill KS, Preisser JS. Evaluating the quality of life of long-term care residents with dementia. Gerontologist. 2005;45(Spec No 1):37–49.
48. Jonsson L, Andreasen N, Kilander L, Soininen H, Waldemar G, Nygaard H, Winblad B, Jonhagen ME, Hallikainen M, Wimo A. Patient- and proxy-reported utility in Alzheimer disease using the EuroQoL. Alzheimer Dis Assoc Disord. 2006;20(1):49–55.
49. Milne DJ, Mulder LL, Beelen HC, Schofield P, Kempen GI, Aranda S. Patients' self-report and family caregivers' perception of quality of life in patients with advanced cancer: how do they compare? Eur J Cancer Care (Engl). 2006;15(2):125–32.
50. Naglie G, Tomlinson G, Tansey C, Irvine J, Ritvo P, Black SE, Freedman M, Silberfeld M, Krahn M. Utility-based quality of life measures in Alzheimer's disease. Qual Life Res. 2006;15(4):631–43.
51. Hoe J, Katona C, Orrell M, Livingston G. Quality of life in dementia: care recipient and caregiver perceptions of quality of life in dementia: the LASER-AD study. Int J Geriatr Psychiatry. 2007;22(10):1031–6.
52. Arlt S, Hornung J, Eichenlaub M, Jahn H, Bullinger M, Petersen C. The patient with dementia, the caregiver and the doctor: cognition, depression and quality of life from three perspectives. Int J Geriatr Psychiatry. 2008;23(6):604–10.
53. Zimmermann F, Endermann M. Self-proxy agreement and correlates of health-related quality of life in young adults with epilepsy and mild intellectual disabilities. Epilepsy Behav. 2008;13(1):202–11.
54. Huang HL, Chang MY, Tang JS, Chiu YC, Weng LC. Determinants of the discrepancy in patient- and caregiver-rated quality of life for persons with dementia. J Clin Nurs. 2009;18(22):3107–18.
55. Kunz S. Psychometric properties of the EQ-5D in a study of people with mild to moderate dementia. Qual Life Res. 2010;19(3):425–34.
56. Schiffczyk C, Romero B, Jonas C, Lahmeyer C, Muller F, Riepe MW. Generic quality of life assessment in dementia patients: a prospective cohort study. BMC Neurol. 2010;10:48.
57. Jones JM, McPherson CJ, Zimmermann C, Rodin G, Le LW, Cohen SR. Assessing agreement between terminally ill cancer patients' reports of their quality of life and family caregiver and palliative care physician proxy ratings. J Pain Symptom Manag. 2011;42(3):354–65.
58. Bruvik FK, Ulstein ID, Ranhoff AH, Engedal K. The quality of life of people with dementia and their family carers. Dement Geriatr Cogn Disord. 2012;34(1):7–14.
59. Whynes DK, Sprigg N, Selby J, Berge E, Bath PM. Testing for differential item functioning within the EQ-5D. Med Decis Mak. 2013;33(2):252–60.
60. Claes C, Vandevelde S, Van Hove G, van Loon J, Verschelden G, Schalock R. Relationship between self-report and proxy ratings on assessed personal quality of life-related outcomes. J Policy Pract Intellect Disabil. 2012;9(3):159–65.
61. Moyle W, Murfield JE, Griffiths SG, Venturato L. Assessing quality of life of older people with dementia: a comparison of quantitative self-report and proxy accounts. J Adv Nurs. 2012;68(10):2237–46.
62. Sheehan BD, Lall R, Stinton C, Mitchell K, Gage H, Holland C, Katz J. Patient and proxy measurement of quality of life among general hospital in-patients with dementia. Aging Ment Health. 2012;16(5):603–7.
63. Arons AM, Krabbe PF, Scholzel-Dorenbos CJ, van der Wilt GJ, Rikkert MG. Quality of life in dementia: a study on proxy bias. BMC Med Res Methodol. 2013;13:110.
64. Yeaman PA, Kim DY, Alexander JL, Ewing H, Kim KY. Relationship of physical and functional independence and perceived quality of life of veteran patients with Alzheimer disease. Am J Hosp Palliat Care. 2013;30(5):462–6.
65. Zucchella C, Bartolo M, Bernini S, Picascia M, Sinforiani E. Quality of life in Alzheimer disease: a comparison of patients' and caregivers' points of view. Alzheimer Dis Assoc Disord. 2015;29(1):50–4.
66. Bryan S, Hardyman W, Bentham P, Buckley A, Laight A. Proxy completion of EQ-5D in patients with dementia. Qual Life Res. 2005;14(1):107–18.
67. Crespo M, Hornillos C, de Quiros MB. Factors associated with quality of life in dementia patients in long-term care. Int Psychogeriatr. 2013;25(4):577–85.


Acknowledgements

We are very grateful to all who participated in the research, the three organisations who supported recruitment of care workers and family carers and also Nick Smith, Ann-Marie Towers and Juliette Malley who provided advice and comments on drafts of the proxy-report questionnaire.

Funding

The research on which this article is based was funded by the Department of Health and undertaken by researchers at the Quality and Outcomes of person-centred care Research Unit (QORU). The views expressed here are those of the authors and are not necessarily shared by the department.

Availability of data and materials

The raw data from the study reported in this article is not freely available because we do not have consent for publication of these data. The proxy-version of the Adult Social Care Outcomes Toolkit (ASCOT) is available upon request for non-commercial purposes from the ASCOT team (ascot@kent.ac.uk).

Authors’ contributions

SR contributed to the design of the study, collected and analysed the data, and drafted the manuscript. JC designed the study, collected the data and contributed to the draft of the manuscript. GC contributed to the design of the study, collected and analysed the data, and contributed to the draft of the manuscript. JF contributed to the design of the study and the draft of the manuscript. All authors have read and approved the final manuscript.

Competing interests

The authors declare that they have no competing interests.

Consent for publication

Written informed consent for publication of anonymised interview data in research summaries, reports, conference presentations and academic journal articles was obtained from all participants.

Ethics approval and consent to participate

The study was reviewed and approved by the national social care research ethics committee in England (reference: 13/IEC08/0020). Written informed consent was obtained from all participants prior to interview.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Author information

Corresponding author

Correspondence to Stacey Rand.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Rand, S., Caiels, J., Collins, G. et al. Developing a proxy version of the Adult social care outcome toolkit (ASCOT). Health Qual Life Outcomes 15, 108 (2017). https://doi.org/10.1186/s12955-017-0682-0


Keywords