Introduction

Although cancer is rare in adolescents and young adults (AYA), it is the leading cause of death in this age group after unintentional injury, homicide, and suicide [1]. Survival following a cancer diagnosis in AYAs (defined as ages 15–39) has shown little improvement relative to other age groups [2]. In particular, cancer patients diagnosed between the ages of 25 and 35 have had no survival improvement in more than 20 years [2]. Although scientific evidence on the reasons for this phenomenon is limited, experts have speculated that the lack of improvement in survival may reflect a combination of factors: the scarcity of available clinical trials, which hampers efforts to develop novel therapies [3]; lack of medical insurance; poor access to medical care for the initial diagnosis and for follow-up care; attitudes of invincibility; physicians’ low suspicion of cancer, resulting in delayed diagnosis; and the location and specialty of treating physicians [4].

To address several of these issues in AYA cancer patients, the National Cancer Institute (NCI) in 2007, in partnership with the Lance Armstrong Foundation (LAF), initiated an observational cohort study to: 1) examine factors associated with high quality cancer treatment in general community practice, including the use of clinical trials and treatment protocols, and 2) assess patient-reported outcomes, such as health-related quality of life, unmet needs, and the impact of cancer on psychosocial domains. The design and implementation of the Adolescent and Young Adult Health Outcomes and Patient Experience (AYA HOPE) Study was motivated and guided by the findings of a systematic review of the science on AYA cancer patients [5].

Here we report on the feasibility of 1) recruiting AYA cancer survivors, 2) developing and fielding a patient survey, with a subsequent survey to examine changes over time, and 3) obtaining patient (or guardian) consent to review medical records. We focus on the practical challenges of recruiting and conducting research among AYA cancer survivors identified through population-based registries across the United States, and we discuss potential strategies to increase recruitment of these survivors.

Methods

Study setting and eligibility criteria

Patients were identified through seven population-based Surveillance, Epidemiology, and End Results (SEER) program cancer registries: Detroit; Seattle/Puget Sound; Los Angeles County; San Francisco/Oakland; Greater California (13 counties around Sacramento plus Orange County); and the states of Iowa and Louisiana.

Conduct of this study required approval from 9 Institutional Review Boards (IRBs): those of the 7 registries, 1 state (California), and the NCI. Obtaining approval took 7 months. Several registries experienced particularly long approval processes because the inclusion of minors required changes to consent forms. No concerns were raised by the IRBs about survey content, although there was concern about the security of the online survey; this was resolved with a more complete explanation of the security protocols. The online survey website included a firewall, a Virtual Private Network (VPN), an intrusion detection system, and routine security checks of the computer resources. Patients were given a website address, user name, and password in the initial mailing. Patients accessing the online survey were required to create a new password upon entering the system, making the survey accessible only to the person holding the new password. Once the survey was submitted, no further access was allowed.
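As an illustration only, the sketch below shows the kind of one-time-access logic described above: mailed credentials, a patient-chosen password at first login, and no further access after submission. The class and field names are hypothetical; the study's actual survey platform is not described at this level of detail.

from dataclasses import dataclass
from typing import Optional

@dataclass
class SurveyAccount:
    user_name: str
    mailed_password: str                        # credential included in the initial mailing
    personal_password: Optional[str] = None     # set by the patient at first login
    submitted: bool = False                     # once True, no further access is allowed

class SurveyPortal:
    def __init__(self) -> None:
        self._accounts: dict[str, SurveyAccount] = {}

    def register(self, account: SurveyAccount) -> None:
        self._accounts[account.user_name] = account

    def log_in(self, user_name: str, password: str, new_password: Optional[str] = None) -> bool:
        account = self._accounts.get(user_name)
        if account is None or account.submitted:
            return False  # unknown user, or survey already submitted
        if account.personal_password is None:
            # First login: mailed credentials must match and a new password must be chosen.
            if password != account.mailed_password or not new_password:
                return False
            account.personal_password = new_password
            return True
        return password == account.personal_password

    def submit_survey(self, user_name: str) -> None:
        self._accounts[user_name].submitted = True  # lock the account after submission

portal = SurveyPortal()
portal.register(SurveyAccount("patient001", mailed_password="temp-123"))
assert portal.log_in("patient001", "temp-123", new_password="my-new-password")
portal.submit_survey("patient001")
assert not portal.log_in("patient001", "my-new-password")  # access closed after submission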

Eligible patients were diagnosed between July 1, 2007 and October 31, 2008 with a first invasive, histologically confirmed germ cell cancer, non-Hodgkin lymphoma (NHL), Hodgkin lymphoma (HL), acute lymphocytic leukemia, Ewing’s sarcoma, osteosarcoma, or rhabdomyosarcoma (Appendix I); were ages 15 through 39 years at diagnosis; resided in the study area; and were able to read English. The goal was to obtain 530 surveys completed 6–14 months from the date of diagnosis. A small sample of deceased patients who were otherwise eligible was included in the medical record review.

Data collection instruments

The survey (http://outcomes.cancer.gov/surveys/aya/), which took approximately 20 minutes to complete, was designed to be self-administered and to address a number of issues, including the impact of cancer, health-related quality of life, healthcare delivery, and reasons for not participating in clinical trials. Survey development included cognitive and usability testing with 24 AYA cancer survivor volunteers.

Patients were asked to complete a healthcare provider form by listing the names and addresses of all healthcare providers and facilities providing care. The information was used to supplement registry information and to request medical records for data abstraction.

Patient recruitment

Before patients were contacted, each patient’s physician was notified that the patient would be approached to participate in the study. Physicians were asked to advise study staff if there were reasons the patient should not be contacted. If the staff had not heard from the physician within 3 weeks, they contacted the patient. Some physicians and one state registry required active physician consent.

Eligible patients were mailed a cover letter, a brochure introducing the study, an overall study consent document (where required by the registry’s IRB), healthcare provider form, a Health Insurance Portability and Accountability Act (HIPAA) compliant consent for the release of medical information, a paper survey, a web address for the online survey, a pre-addressed stamped return envelope for study materials, and a LIVESTRONG bracelet. If the patient was a minor at the time of contact, study materials were mailed to the parent/guardian and included a consent document requiring signatures from the parent/guardian and a signature of assent from the minor.

Follow-up procedures for non-respondents included a second mailing 3 weeks after the initial mailing, followed by telephone calls from trained interviewers beginning 2 weeks after the second mailing. If needed, multiple follow-up calls were made to ask the patient to complete the survey over the telephone and mail the medical record release and the healthcare provider form.

If study materials were returned marked undeliverable, the staff employed a variety of tracing procedures to obtain the correct address, including reviewing the cancer registry database for updated information, contacting directory assistance, and obtaining information from healthcare providers, hospitals, and family members. Study sites also used website directories (e.g., Google, Zabasearch), paid directory services (e.g., Accurint, Coles), and/or public records (e.g., Department of Motor Vehicles, Voter Registration). A few study sites sought new addresses by exploring social networking websites. If a new address was not found, the staff attempted to contact the individuals by telephone. If the interviewer reached an answering machine, a message was left; no more than 2 messages were left per patient. Phone calls were made at different times during the day, in the evening, and on weekends. If none of these efforts was successful, the patient was classified as lost to follow-up.

Patients who completed the survey were sent a thank you letter and $25.00 for the time spent completing it. An additional $25.00 was provided for time spent completing the healthcare provider and medical record release forms.

Medical record retrieval

Medical records were collected from facilities listed on the healthcare provider form. Information obtained included type of healthcare facilities, physician subspecialties, tumor characteristics and staging, diagnostic procedures, participation in active clinical trials, therapy provided, and comorbid conditions.

Follow-up survey

We also conducted a follow-up survey, designed to be self-administered by paper or online, to examine changes in psychosocial outcomes and health-related quality of life 15–35 months following the initial diagnosis. Survey development (http://outcomes.cancer.gov/surveys/aya/), tracing, and follow-up procedures were similar to the methods used for the initial survey. Participants were provided $50 for time spent completing the survey. No additional medical record review was performed.

Analyses

Descriptive univariate analyses were conducted to compare the demographic and tumor characteristics of baseline survey respondents and non-respondents based on data collected by each SEER registry. We used the patient’s address at the time of cancer diagnosis to determine area-level educational attainment and median family income at the census tract level [6]. A multiple logistic regression model was used to assess the association between clinical and non-clinical factors and participation. Survey participants were also compared by survey mode and by follow-up survey completion. A significance level of 0.05 (two-tailed) was used for all analyses.
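For readers who wish to reproduce this type of analysis, the sketch below shows one way such a multivariable logistic regression could be specified. It is illustrative only: the file and column names are hypothetical, and the study's actual software and covariate coding are not reported here.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per potentially eligible patient,
# with responded = 1 if the baseline survey was completed, 0 otherwise.
df = pd.read_csv("aya_hope_baseline.csv")  # hypothetical file name

# Logistic regression of participation on clinical and non-clinical factors
# (sex, race/ethnicity, cancer site, age group, census tract education and income).
model = smf.logit(
    "responded ~ C(sex) + C(race_ethnicity) + C(cancer_site) + C(age_group)"
    " + tract_education + tract_income",
    data=df,
).fit()

print(model.summary())             # coefficients and two-tailed Wald tests
print(model.conf_int(alpha=0.05))  # 95% confidence intervals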

Results

A total of 1,405 patients were identified as potentially eligible (Fig. 1). Among registries that required physician consent, only 6 physicians refused to allow their patient to be contacted. Physicians identified 11 patients as deceased and 9 as ineligible. Another 70 patients became ineligible because they exceeded their 14-month eligibility date prior to contact. After mailing the survey (n = 1,309), we identified 16 deceased patients and 85 ineligible patients, who did not speak English (56%), denied having cancer (9%), or were ineligible for other reasons (34%).

Fig. 1 Patient number flowsheet for the study

Response rates

A total of 524 eligible patients completed the initial survey and 1 patient completed only the medical record release (Fig. 1), yielding a response rate of 43%. Of patients sent the initial mailing, 87% required additional contact.
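These figures are consistent with the counts in Fig. 1, assuming the denominator is the mailed sample minus patients found after mailing to be deceased or ineligible: (524 + 1) / (1,309 − 16 − 85) = 525 / 1,208 ≈ 43%.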

Of the refusals, 133 patients directly declined participation when contacted and 341 never completed the survey despite repeated contacts (Fig. 1). Among patients lost to follow-up, telephone numbers were found for 126, but several calls to these numbers did not produce a valid mailing address. For the remaining 83 patients, neither a valid telephone number nor an address could be found.

Generalizability of the enrolled sample

We compared the characteristics of patients who participated in the survey to those of non-respondents (Table 1). In univariate analyses, non-respondents did not differ significantly from respondents by age, census tract education, or census tract median family income. Response did vary by cancer site (p < 0.04), from 38% for acute lymphocytic leukemia and sarcoma to 51% for HL. Females were more likely to participate (p < 0.0001). Non-Hispanic black (p < 0.05) and Hispanic (p < 0.001) patients were less likely to respond than non-Hispanic whites. Among non-respondents, non-Hispanic whites and non-Hispanic blacks were more likely to refuse (p < 0.001), while Hispanics (32%) were more than twice as likely to be lost to follow-up as non-Hispanic white (12%) or non-Hispanic black (14%) patients (data not shown).

Table 1 Percent distribution of demographic characteristics for respondents and non-respondents

In a multivariable logistic regression model, we investigated factors thought to be associated with response (Table 2). Females were more likely to answer the survey than males (p < 0.001). Compared to non-Hispanic whites, Hispanics and non-Hispanic blacks were less likely to respond. Other factors in Table 2 were not significantly associated with participation.

Table 2 Multivariable analysis examining characteristics associated with response vs. non-response to the AYA HOPE Study Baseline Survey

Response by survey mode

Some patients chose to complete the survey online (22%) or over the telephone (2%) (Fig. 1 and Table 3). Compared to patients in other age or racial/ethnic groups, fewer 20–24 year-olds and fewer non-Hispanic black patients completed the survey online, and less than 10% of sarcoma patients did so. Patients with an Associate Degree or higher, or who lived in higher socioeconomic status census tracts as defined by education and income, were more likely to complete the survey online.

Table 3 Mode of survey completion by patient characteristics, AYA HOPE study

Medical records

Among participants completing the survey, 5% refused to sign a medical records release consent form (Fig. 1). Medical records were obtained for 93% of all respondents. However, we were unable to obtain medical records for 2% of consenting participants because the records were lost within the healthcare facility or the facility failed to provide them after multiple requests. Medical records were obtained for a small sample (n = 27) of deceased patients primarily to assess any differences in the diagnosis and treatment that might arise from excluding patients who had died.

Follow-up survey

Of the 524 eligible participants who completed the initial survey, 465 completed the follow-up survey, 34 were non-responders, and 10 were lost to follow-up (Table 4). The participation rate in the follow-up survey was over 91% among survivors. Between the initial and subsequent survey (8–17 months), 3% (n = 15) of the patients died.

Table 4 Follow-up survey completion by patient characteristics, AYA HOPE Study

Discussion

The goal of the AYA HOPE Study was to learn about the recruitment of AYAs, their cancer care and outcomes, and to determine the feasibility of collecting survey data and detailed medical record information on a representative population-based sample of AYA cancer survivors. Our response rate among eligible patients was 43%. This age group, particularly those ages 15–25 years, is mobile and difficult to follow due to educational and employment opportunities, marriage, and other personal life changes.

Previous studies have shown that response rates of young adults are lower than those of older adults [7–9]. A survey of adult NHL survivors identified from a cancer registry had a 55% response rate, but those patients were older and 2–5 years post-diagnosis [7]. Despite aggressive follow-up, one recent health survey of adolescents ages 13–17 years yielded a 40% response rate [8]. Those adolescents had been seen within the healthcare system in the previous year and were recruited by that system. A national study of childhood cancer survivors conducted between 1994 and 2000 achieved a 62% completion rate with extensive resources [9]. However, the denominator for that completion rate included only those subjects who expressed an interest in the study after being contacted by their treating institution. As indicated by our follow-up survey, once individuals agree to participate in a study, they are more likely to complete subsequent surveys. The response rate in our study might have been higher if, as in the NHL study [7], more time had elapsed since diagnosis. Some patients eligible for our study who refused may still have been under active treatment and might have agreed to participate once therapy was completed.

Males and non-Hispanic black and Hispanic patients were significantly less likely to participate, similar to the differences reported for cancer clinical trials [10, 11]. Significantly lower enrollment for minority compared to white patients, and for men compared to women, has been reported for surgical trials [11]. This pattern has been shown not only in cancer but in other diseases as well [12]. Although 37% of our respondents were racial/ethnic minorities, this proportion would likely have been higher had translated versions of the survey been available, as not speaking English was the primary reason for ineligibility in our population.

A variety of methods were used to maximize responses to the mailing. We enclosed a LIVESTRONG bracelet, which added interest and bulk to the packet. One registry initially used an overnight delivery service but abandoned it within a month; response rates were lower than with USPS. The overnight service did not leave packages if no one was home, offered limited re-delivery, did not forward to new addresses, and delivered only on weekdays. While we used a variety of approaches to enhance our response rates, we did not send pre-notification letters or postcards. Research has suggested that pre-notification may not increase the response rate [12], but it would have identified incorrect addresses at a lower cost than mailing the complete packet.

Overall, 87% of the identified sample required at least one additional mailing or telephone contact. An address or telephone number could not be found for 16% of patients identified as eligible, similar to the 15% reported by the childhood cancer survivors study [9]. Although we likely had the correct phone number for some patients, no one answered our calls. With the increased use of caller identification, patients may have been screening calls from unrecognized phone numbers. Calls originating from patients’ own medical institutions might have yielded better responses, although a response rate of only 40% was reported by a study conducted among patients in a healthcare organization [8].

Some survivors refused to participate in any part of our study because of the HIPAA medical release form. A community-based, randomized, mixed-mode survey (n = 6,939) reported that the inclusion of even a minimally burdensome HIPAA authorization form reduced survey response rates by up to 15% [13]. Simply requiring a signature reduces the response rate: Nelson reported that response rates were 58% at locations requiring no advance permission to contact the individual, compared to 27% at those requiring written permission from the individual [14].

Although some staff used social networking websites to find addresses, we did not use this approach to communicate with patients because the registries had not requested IRB approval for this type of contact; we were also concerned about the confidentiality of a social network contact and about identifying the correct person. In future studies, social networking websites may be a useful method of contact [15] and may increase participation, as email and web communication are used extensively among young people; however, methods to ensure patient confidentiality would be required.

The majority of patients in our study completed the paper rather than the online version of the survey. Future studies might evaluate different approaches to increase participation and the use of online surveys. For example, the initial mailing could include a web address linking to a well-designed website that describes the study and the reasons to participate and that offers the survey along with other survivorship information; this might entice AYAs to participate. However, it is possible that AYAs simply prefer the paper version. A study in Olmsted County found that including an internet option decreased the response rate [16], while a study in Norway found no increase with the online option [17]. One possibility is that, unless thrown away, a paper survey is a constant reminder, whereas a computer can be turned off.

Conclusion

The AYA HOPE Study demonstrates that recruiting and following a diverse population of AYA survivors diagnosed with different cancers and living throughout the US can be accomplished using population-based registries. However, researchers should be realistic in their expectations, recognizing that this population is mobile and difficult to contact. Achieving a high response rate is challenging and requires extensive resources for follow-up. Despite these limitations and challenges, the AYA HOPE Study confirms that cancer registries provide a valuable foundation for conducting observational, longitudinal, population-based research on younger, non-pediatric cancer survivors.