
A content analysis of the Meaningful Use clinical summary: do clinical summaries promote patient engagement?

Published online by Cambridge University Press:  20 July 2015

Karen Jiggins*
Affiliation:
College of Nursing and Health Innovation, Arizona State University, Arizona, USA
*
Correspondence to: Karen Jiggins, PhD, MBA, RN, College of Nursing and Health Innovation, Arizona State University, 500 N 3rd Street Phoenix, AZ 85004, USA. Email: [email protected]

Abstract

Aim

This study analyzed Meaningful Use (MU) clinical summaries (CS) given to 100 older adults (⩾65) from 10 family physicians in an urban primary care practice.

Background

In the United States, MU was designed to promote and enhance patient engagement in hospitals and clinics across the country, providing financial incentives to physicians attesting to the Meaningful Use of a certified Electronic Health Record by meeting a series of measures and objectives. The CS is intended to support patient and family engagement by communicating elements discussed during the clinical encounter including an updated medication list, problem list, and plan of care (POC). Despite the $27.7 billion spent distributing MU payments to more than 418,000 Eligible Professionals in ambulatory care to date, there is little discussion in the scholarly literature supporting the use of the CS to facilitate patient engagement.

Methods

Ten CS were accessed from each of 10 family physicians during a regular practice week. Directed content analysis and descriptive statistics were used to evaluate the summaries. Key variables of analysis included diagnoses, medications, plan of care content, availability, completeness, health literacy, format, and readability.

Findings

CS contained an average of 5.2 diagnoses and 10 medications. Summaries contained vital signs (98%), lab results (9%), smoking status (88%), professional care team members (4%), follow-up appointments (46%), and POC (67%); 37% of CS were judged to be incomplete. Readability scores indicated that a university education was required to understand the CS. CS support patient engagement by supplying information that supports behavior change and self-management; however, barriers to patient engagement exist, including (a) access, (b) poor document readability, and (c) a lack of customization to the patient’s experience.

Type
Research
Copyright
© Cambridge University Press 2015 

Patient engagement enjoyed its formal début with the opening act of Meaningful Use, as participating Eligible Professionals (EPs) were required to produce a clinical summary for patients and families leaving the outpatient clinic. After-visit summaries have been distributed upon discharge from hospitals for many years with varying degrees of success (Institute of Medicine (IOM), 2014), but it was not until the Electronic Health Record (EHR) Incentive Program (known as ‘Meaningful Use’), administered by the Centers for Medicare and Medicaid Services (CMS) and the Office of the National Coordinator for Health Information Technology (ONC), that ambulatory clinics began distributing a similar document, as early as 2012.

The purpose of the clinical summary is to document the plan of care and provide information to assist patients and families in managing their health and healthcare. The clinical summary was designed to further promote patient and family engagement, a national health priority area (Agency for Healthcare Research and Quality (AHRQ), 2013) and a main goal of the EHR Incentive Program (US Department of Health and Human Services (DHHS), 2010). The clinical summary contains various elements as outlined by ONC/CMS, including an updated medication list, problem list, a list of procedures, labs and other orders, instructions given to the patient based on clinical discussions that took place during the visit, the times and locations of upcoming tests and appointments, recommended patient decision aids, and any recent test results (CMS, 2013).

During Stage 1, clinical summaries were printed for patients at the conclusion of their healthcare encounter with the physician. Stage 2 requires that 5% of an EP’s patients access (view), download, and transmit their electronic personal health information from a practice-based patient portal to other members of their healthcare team (DHHS, 2010, 2012). The provision of the electronic clinical summary through the patient portal (to more than 50% of patients) is unique to Stage 2, as is the requirement that 5% of the EP’s patients use secure messaging (email) to communicate with providers (DHHS, 2012).
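To make the arithmetic behind these thresholds concrete, the short Python sketch below checks hypothetical counts against the 5% and 50% figures described above. The measure names, counts, and threshold logic are illustrative assumptions, not CMS attestation or reporting code.

```python
# Illustrative sketch only: hypothetical threshold check for the Stage 2 measures
# described above (5% view/download/transmit, >50% provided an electronic summary,
# 5% secure messaging). All counts below are invented for the example.

def meets_threshold(numerator: int, denominator: int, threshold_pct: float) -> bool:
    """Return True if numerator/denominator exceeds the threshold percentage."""
    if denominator == 0:
        return False
    return (numerator / denominator) * 100 > threshold_pct

# Hypothetical counts for one Eligible Professional's reporting period.
patients_seen = 1200
vdt_patients = 70          # viewed, downloaded, or transmitted their record
portal_summaries = 640     # received an electronic clinical summary via the portal
secure_messages = 58       # sent a secure message to the provider

print("VDT (5%):", meets_threshold(vdt_patients, patients_seen, 5))
print("Portal summaries (50%):", meets_threshold(portal_summaries, patients_seen, 50))
print("Secure messaging (5%):", meets_threshold(secure_messages, patients_seen, 5))
```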

Clinical summaries are designed by individual EHR vendors and therefore vary greatly in their composition and style. As long as the clinical summary contains the elements outlined by CMS/ONC, the vendor’s product can be certified and Eligible Professionals may attest to Meaningful Use measures using the certified EHR. Individual physician users may have the ability to alter a vendor-designed clinical summary for content or appearance, depending on the individual EHR product deployed at a given site. The degree of manipulation that is possible varies greatly. Some can be customized per user, others per clinic, and some clinical summaries are not modifiable by the physician-client at all.

Despite the $27.7 billion spent distributing MU payments to more than 514,000 eligible hospitals and professionals in ambulatory care environments to date (CMS, 2014), there is surprisingly little discussion in the scholarly literature about the content of the clinical summary, and even less discussion about whether or not the utilization of a clinical summary actually facilitates patient engagement. As a starting point for further research, this study sought to analyze the Meaningful Use clinical summaries obtained from a certified EHR. IRB approval was obtained through Arizona State University (#00000847).

Methods

Sample

This study sought to analyze 100 clinical summaries selected by convenience sampling. Each summary was produced by the practice’s Allscripts TouchWorks EHR. The summaries were printed from the charts of patients aged 65 or older. Patients in this age range, who are Medicare eligible, are known to have multiple diagnoses and medications, and therefore offer a sufficiently complex plan of care for examination. Clinical summaries were included regardless of whether they were printed in the office on paper (paper summaries) or pushed to the patient portal for the patient to retrieve online (e-summaries).

Setting

Ten summaries were obtained from the schedule of each of 10 physicians practicing in an urban family practice group in Arizona. The 10 physician providers were hand-selected in consultation with the group’s medical director and represented the ‘super users’ of the practice; therefore, the clinical summaries examined were expected to be the best and most comprehensive summaries available. The 10 physicians chosen were all board-certified and experienced (10–40 years of practice): five family physicians and five internists. There were two female and eight male providers.

Data collection strategy

Data were collected on a Thursday. Each physician’s schedule was reviewed, starting on Wednesday, for eligible patient records. Each appointment on the schedule was reviewed for the age of the patient, choosing patients aged 65 or greater, starting with the first appointment of the day and working down to the last appointment of the day. No appointment type was excluded; therefore, the collected clinical summaries represent different types of appointments, whether annual physicals, acute visits, or follow-up appointments for chronic disease management. If 10 summaries were not obtained from the first day’s schedule, the previous day’s schedule was reviewed (starting at the beginning of clinic and working toward the end) until 10 summaries could be obtained. It took an average of 2.2 days (minimum 1 day, maximum 4 days) to retrieve 10 summaries for each physician. If the clinical summary printed was incomplete (e.g., missing the plan), a clinical note from that visit was printed so that the plan elements could be examined from the encounter note.

Data management

Clinical summaries were printed from the daily schedule page of each participating physician without accessing the patient’s chart, unless the clinical summary was incomplete, in which case the chart was accessed and a copy of the last office note was also printed. Clinical summaries were marked with an alphanumeric code (physician = A–J, case number = 1–10). Identifying patient information (name, date of birth, medical record number) that displayed on the clinical summary or office note was removed from each page by detaching the document headers and footers with a paper cutter. Summaries were reviewed by the clinic’s medical director before removal from the site. Folders containing the de-identified clinical summaries were stored in a locked cabinet in a locked research office at ASU. Data from the paper forms were transcribed into a Microsoft Excel spreadsheet and double-checked for accuracy. Data were stored on a dedicated research computer with password protection and encryption using TrueCrypt.

Data analysis procedures

Data were analyzed in keeping with general principles of naturalistic research (Denzin and Lincoln, 1994; Sandelowski, 1995; Glaser and Strauss, 2012; Creswell, 2013). Data analysis techniques included descriptive analysis and directed content analysis (Miles and Huberman, 1994; Sandelowski and Leeman, 2012; Creswell, 2013). This type of content analysis, used extensively by health researchers, allows investigators to further describe phenomena that are ‘incomplete or would benefit from further description’ (Hsieh and Shannon, 2005: 1281). Summaries were read and re-read multiple times. Content such as instructions or information contained under headings such as reason for visit, diagnoses, medications, plan, allergies, and future appointments was hand-keyed into multiple data matrices (Miles and Huberman, 1994) for analysis. Particular attention was directed at the content of the plan or orders section of the clinical summary, as this section ought to contain a set of easy-to-follow instructions for patients and families regarding next steps in the treatment or management of disease. Twenty percent of the clinical summaries (n=20) were scanned into a Microsoft Word document so that they could be entered into an online readability index to assess readability (www.online-utility.org).
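As an illustration of the kind of presence/absence matrix and descriptive tallies this approach produces, the following Python sketch codes a few invented summaries and reports element frequencies. The element names and records are hypothetical; the study itself hand-keyed real summaries into Excel.

```python
# Minimal sketch of a presence/absence matrix for clinical summary elements and
# the descriptive counts derived from it. The coded records below are invented.

from collections import Counter

ELEMENTS = ["diagnoses", "medications", "plan_of_care", "vital_signs",
            "smoking_status", "follow_up"]

# Each row codes one de-identified summary (e.g., "A1" = physician A, case 1).
coded_summaries = {
    "A1": {"diagnoses": True, "medications": True, "plan_of_care": True,
           "vital_signs": True, "smoking_status": True, "follow_up": False},
    "A2": {"diagnoses": False, "medications": True, "plan_of_care": False,
           "vital_signs": True, "smoking_status": True, "follow_up": True},
}

counts = Counter()
for summary in coded_summaries.values():
    for element in ELEMENTS:
        if summary.get(element):
            counts[element] += 1

n = len(coded_summaries)
for element in ELEMENTS:
    print(f"{element}: {counts[element]}/{n} ({100 * counts[element] / n:.0f}%)")
```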

Results

A total of 100 clinical summaries were reviewed from 100 patients. The average age of the patients whose clinical summaries were examined was 76 years (65–98). In all, 60% of the sample represented female patients. In total, 11 of the clinical summaries were delivered electronically to patients via the patient portal.

Diagnosis

On average, each clinical summary contained 5.2 diagnoses (range 1–40). These were listed under a heading labeled ‘Today’s Diagnoses’, suggesting that these lists comprised the problems assessed at the visit rather than the patient’s full problem list (Figure 1). The diagnosis list was omitted in 19 clinical summaries.

Figure 1 Diagnosis list

Vital signs

A vital sign panel, including blood pressure, temperature, heart rate, weight, BMI calculated, BSA calculated, and O2 saturation, was included in 98 of the clinical summaries (Figure 2).

Figure 2 Vital sign display

Medications and allergies

Every clinical summary examined contained a medication list (n=100). Each clinical summary contained an average of 10 medications (range=1–29). A total of 73 medication lists included over-the-counter medications such as aspirin and vitamins (Figure 3). The presence or absence (i.e., NKDA or NDA) of allergies was noted in all but six clinical summaries.

Figure 3 Medication display

Smoking status

The patient’s smoking status appeared in all but 12 clinical summaries.

Care team

Four clinical summaries contained a list of other providers seen by the patient.

Lab results

In two instances, lab results were transcribed by the physician into the plan section of the clinical summary. In seven instances, the plan mentioned that lab results were discussed (five) or printed separately and handed to the patient (two).

Plan of care

Voice

Medical records have traditionally been written in the third-person voice. Four physicians produced clinical summaries in the third person (e.g., presumed ‘he/she’), essentially giving the patient access to the plan contained in their own encounter note. Five providers primarily used the second person (‘you/yours’) to address the reader directly and recorded this personal version of the plan in their own encounter note. One provider made broad use of the first person (‘I/we’), as if giving personal and direct instructions to the patient. Three providers moved back and forth extensively between first, second, and third person (Figure 4).

Figure 4 Range of voice in the plan of care

Synchronicity and completeness

In 33 cases, the plan contained in the clinical summary differed from the plan contained in the provider’s encounter note. Eleven of these were e-summaries: during the time frame of this study, a technical problem prevented the plan section of the encounter note from displaying on the 11 e-summaries produced. Six paper summaries contained no plan for the patient at all, and 10 summaries contained a plan with only follow-up information (e.g., ‘Return to clinic in 6 weeks to evaluate above problem’) (Figure 5). Five summaries included only medication changes in the plan.

Figure 5 Asynchronous plans

Follow-up

In 46 instances, the plan contained specific instructions about when to follow up with the physician. In 30 of those instances, a call-out box highlighted the follow-up appointment in a separate area of the clinical summary.

Content

The plans contained the following elements (Figure 6): notes about seeing a provider for whom a referral was required (23), notes about the diagnosis (23), notes about procedures completed in the office such as ear lavage, cryotherapy, suture removal, or vaccine administration (13), orders for radiology (27), orders for laboratory (37), notes about the discussion of lab results with patients (9), medication changes (48), notes or specific instructions for patients about those medication changes (17), notes about the patient’s personal health plan such as immunizations or routine health screenings that were due (11), and instructions or comments for patients (40). In 46 instances, medication changes were highlighted with a separate call-out box in addition to the medication list (Figure 7).

Figure 6 Plan of care content

Figure 7 Medication changes in plan highlighted with a call-out box

Availability

A clinical summary was judged to be complete if it contained a problem list, a medication list, and a plan of care. A total of 37 paper-based summaries were judged to be incomplete (Figure 8). Reasons for the incomplete paper-based summaries are presented in Table 1. Twelve summaries were not given to patients at check-out, either because the note was not complete (3) or because the summary was simply not printed (9). Six clinical summaries were missing a plan of care, and 19 were missing a problem list. Two physicians accounted for just over half (51%) of the incomplete summaries; 100% of the clinical summaries from one physician and 90% from another were incomplete. Four physicians had no incomplete paper summaries (Table 1).

Figure 8 Electronic clinical summary without a plan of care

Table 1 Incomplete (paper) clinical summaries: physician variation

Readability

A subset of the paper summaries (n=20) was examined for readability. A combination of four indices that estimate the years of education a reader must have in order to be comfortable reading the material (Coleman-Liau, Flesch-Kincaid, Automated Readability Index, SMOG) demonstrated that the reader of a clinical summary would need a college education (an average of 18.72 years of schooling) to be comfortable reading the document. The average Gunning Fog index was 15.37; an ideal score is 7–8, and scores above 12 are not suitable for most readers. The average Flesch Reading Ease score was 43.92, indicating that the summaries were suitable only for university graduates (Table 2).

Table 2 Readability scores
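For readers unfamiliar with how such indices are derived, the sketch below computes two of them, Flesch Reading Ease and Gunning Fog, from their standard published formulas. The crude vowel-group syllable counter is an approximation, and this code is only a rough stand-in for the online readability tool the study actually used.

```python
# Illustrative computation of two readability indices from their standard formulas.
# The syllable counter is a rough heuristic, so scores will only approximate those
# of dedicated tools such as online-utility.org.

import re

def count_syllables(word: str) -> int:
    """Approximate syllables as groups of consecutive vowels."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def readability(text: str) -> dict:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    complex_words = sum(1 for w in words if count_syllables(w) >= 3)

    # Flesch Reading Ease: higher is easier; Gunning Fog: approximate grade level.
    flesch = 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)
    fog = 0.4 * ((n_words / sentences) + 100 * (complex_words / n_words))
    return {"flesch_reading_ease": round(flesch, 2), "gunning_fog": round(fog, 2)}

sample = ("Continue lisinopril 10 mg daily for hypertension. "
          "Return to clinic in six weeks to re-evaluate blood pressure.")
print(readability(sample))
```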

Discussion

This study represents an initial attempt to examine the content of Meaningful Use clinical summaries from a practice receiving payments from the federal EHR Incentive Program. A core principle in national quality improvement strategies is the engagement of chronically ill patients in the creation and execution of their treatment plans. Patient engagement is most commonly defined as the ‘actions individuals must take to obtain the greatest benefit from the health care services available to them’ (CAH, 2010; Gruman et al., 2010). A growing body of evidence demonstrates that patient engagement for individuals with chronic illness results in better adherence, superior self-management skills, improved quality of life, enhanced functional and symptom status, fewer re-hospitalizations, and lower health care costs (Hibbard et al., 2009; Greene et al., 2013; Hibbard and Greene, 2013; Hibbard et al., 2013).

The promise of the clinical summary is tremendous. A thoughtfully crafted clinical summary that contains a plan of care can help patients and families engage by supporting the behaviors in the patient engagement framework (Table 3), such as communicating with health care professionals, making good treatment decisions, and promoting health (CAH, 2010; Gruman et al., 2010). For example, when the physicians in this study provided a list of diagnoses and medications, they helped patients communicate with other health care professionals. When they provided a summary of the preventive care items that were due, they acted to encourage patients to get preventive health care. When the clinical summaries mentioned the need for a completed advance directive, they helped patients plan for the end of life.

Table 3 Clinical summary elements that support engagement through the Engagement Behavior Framework (CAH, 2010; Gruman et al., 2010)

a Elements not observed in this study but that are able to render on the clinical summary document.

The clinical summaries gathered from this practice were produced for older adults, for whom they are arguably most needed to enhance engagement with chronic disease self-management. Older adults are more likely to bear the burden of chronic disease; as many as 80% of the 38 million adults over the age of 65 in the United States manage at least one chronic disease (CDC, 2013). Over two-thirds of Medicare beneficiaries have at least two chronic conditions, and 14% of Medicare beneficiaries have six or more (Anderson, 2010; CDC, 2013; Lochner et al., 2013). For these patients, the clinical summary they receive provides a foundation for chronic disease self-management and engagement in health-promoting behaviors. It is therefore vitally important that the clinical summary contain a plan they can use to monitor their health. Approximately two-thirds of patients received a clinical summary they could use for these purposes, and just less than half of the summaries in this study provided clear, actionable, and thoughtful instructions that patients could use to enhance engagement. The plans of care in this sample demonstrated the variety, complexity, and depth of issues managed by primary care providers (PCPs) and patients. Significant barriers may prevent the Meaningful Use clinical summary from being as effective as possible in engaging older adults; these are highlighted below.

Barrier to patient engagement: health literacy

In addition to living with multiple chronic diseases, older adults experience dramatically lower levels of health literacy, defined as ‘the degree to which individuals have the capacity to obtain, process, and understand basic health information and services needed to make appropriate health decisions’ (IOM, 2004: 3). Rates of limited health literacy are high among older adults (Oldfield and Dreher, 2010; Berkman et al., 2011), with only 3% of older adults scoring in the proficient range (Kutner et al., 2006). Limited health literacy is significantly correlated with the ability to engage in the healthcare system and in self-management behaviors (Gazmararian et al., 2003; Coulter, 2012; Koh et al., 2013; Parker, 2013). Readability scores for the subset of clinical summaries tested revealed reading levels suitable only for those comfortable with extremely complex material: readers in the 11th or 12th grade or with a college education. Previous research suggests that patient reading comprehension ranges from grade 5.4 to 10.8 in outpatient clinics and that 40% of patients tested read at the 5th-grade level or below (Andrus and Roth, 2002). Generally speaking, patient education material should be written at a sixth-grade level or below (Safeer and Keenan, 2005). Clearly, clinical summaries need to be re-worked to reduce complexity and bring the reading level down to levels that reach greater portions of the population.

Clinical summaries need to be much less complex, and EHR vendors ought to create these documents with the principles of health-literate documents in mind. While call-out boxes are helpful for important pieces of information, such as those used for follow-up appointments and medication changes, the diagnoses contained in the clinical summaries were often recorded in ICD or SNOMED language without translation (for example, Lumbar Disc Degeneration, Solitary Pulmonary Nodule, Leukocytosis, Cellulitis, or Hypomagnesemia), which an older adult is unlikely to understand. Similarly, one wonders how many patients understand what BSA stands for, why they should be concerned about their body surface area, or what they should do about it (Figure 2). Some summaries contained acronyms such as CPE, FU, CBC, and CMP as well as CPT codes (Figure 9). While common to healthcare providers and staff, these data points may be confusing or meaningless for the patients we serve. We should aim to include only relevant, meaningful information on the clinical summary and refrain from displaying elements that are not actionable or helpful. Not only is this good practice for the creation of patient education material, but presenting only relevant and motivating information particularly helps those with limited health literacy.

Figure 9 Lab orders with CPT codes that may be hard to understand
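One way such translation could be operationalized is sketched below: a small lookup table that glosses jargon in plain language before a summary is rendered. The glossary entries and the matching logic are illustrative assumptions, not a validated health-literacy resource or any vendor’s feature.

```python
# Hypothetical plain-language lookup applied before rendering a clinical summary.
# The glossary entries below are illustrative examples, not a validated resource,
# and the simple substring matching is only a sketch.

PLAIN_LANGUAGE = {
    "Hypomagnesemia": "low magnesium level in the blood",
    "Leukocytosis": "higher than normal white blood cell count",
    "CBC": "complete blood count (a routine blood test)",
    "CMP": "comprehensive metabolic panel (a routine blood test)",
    "FU": "follow-up visit",
    "BSA": "body surface area",
}

def translate_line(line: str) -> str:
    """Append a plain-language gloss in parentheses after known jargon."""
    for term, gloss in PLAIN_LANGUAGE.items():
        if term in line:
            line = line.replace(term, f"{term} ({gloss})")
    return line

print(translate_line("Labs ordered today: CBC, CMP. FU in 6 weeks."))
```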

Barrier to patient engagement: computer use

Older adults are less likely than younger patients to use the computer, especially for the purpose of gathering health-related information (Jiggins Colorafi, 2014). Although 69% of US households reportedly use the internet, users are disproportionately younger, healthier, wealthier, and more educated than non-users (Wen et al., 2010; Choi and DiNotto, 2013). Only about half of all adults over the age of 65 in the United States use a computer (Keenan, 2009), and only 34% of those over the age of 76 (Zickhur and Madden, 2012). Indeed, only 11% of the summaries in this study were pushed to a patient portal, reflecting the preferences of older adults not to use the computer/internet. Only patients with an email address and a stated preference for the e-summary received one; all others were printed. Although Meaningful Use Stage 2 cautiously encourages the use of health information technology for consumers, such as patient portals and secure messaging with providers (DHHS, 2012), the large and growing cohort of older adults in this country may not be able to interact with the healthcare system in new and increasingly electronic ways.

Barrier to patient engagement: availability

In order to support engagement behaviors, the clinical summary must be transmitted to the patient. In 48% of cases, a complete clinical summary was not made available to patients at the end of their visit. A portion of these (11%) was due to technical problems outside the practice’s control, such as EHR vendor programming errors that caused the e-summaries to render poorly (Figure 10). Problems such as these put physicians in the difficult position of not complying with federal policy through no fault of their own and, more importantly, represent a missed opportunity to affect the patient experience. However, the variability found between physicians (ranging from 0 to 10 incomplete clinical summaries each) suggests that individual users have a significant impact on the availability and quality of the clinical summary (Table 1).

Figure 10 Uninterpretable e-summary

Barrier to patient engagement: user variation

Great variation was noted between physicians in the type of information they communicated to patients and, one can assume, the amount of time spent creating the plan of care. The voice physicians selected was interesting and warrants further study. The PCPs in this sample took great care to provide reassurance and instructions to their patients in a casual, friendly manner. While some clinical summaries contained a sparse message to ‘return to clinic to evaluate above problem’, which communicates very little in the way of chronic disease management, other physicians created hand-keyed plans that were incredibly detailed and personal, reflecting the intimate relationship patients and families have with their PCPs. For example: ‘If you cannot get with the VA, call me and we will get a referral to another dermatologist,’ ‘Put the drops in and close your eyes for about a minute. It does sting, but really helps,’ ‘You need to work on exercise and weight loss. Exercise is important. Your blood pressure looks good,’ and ‘We discussed your lab results today and I hope I answered all of your questions. If not, please let me know.’

EHR vendor technology can be helpful in these instances: phrases likely to be used frequently can be created ahead of time and dropped into the note as needed, such as ‘Continue current therapy as prescribed,’ ‘Please call if symptoms worsen or fail to resolve as further evaluation may be needed,’ ‘An option for your imaging is (name, phone, and address of vendor),’ ‘We’ve given you a copy of your labs for your reference,’ or ‘Please read and review the health maintenance handout we gave you today.’ These template phrases can make the plan appear more personal to the patient without taking a large amount of the physician’s time (Figure 11). The reliability with which elements such as vital signs, diagnoses and medication lists, and smoking status appeared in the clinical summaries testifies to the power of good programming.

Figure 11 Examples of plans using template items
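A minimal sketch of what such a template-phrase library might look like in code follows; the phrase keys, wording, and assembly function are assumptions for illustration, since actual ‘quick text’ features differ by EHR vendor.

```python
# Illustrative template-phrase library for assembling a patient-facing plan.
# Keys and wording are hypothetical; real EHR template features vary by vendor.

TEMPLATES = {
    "continue_therapy": "Continue current therapy as prescribed.",
    "call_if_worse": "Please call if symptoms worsen or fail to resolve, as further "
                     "evaluation may be needed.",
    "labs_copy": "We've given you a copy of your labs for your reference.",
    "health_handout": "Please read and review the health maintenance handout we gave "
                      "you today.",
}

def build_plan(template_keys, personal_notes=()):
    """Combine selected template phrases with free-text notes from the physician."""
    lines = [TEMPLATES[key] for key in template_keys]
    lines.extend(personal_notes)
    return "\n".join(lines)

print(build_plan(
    ["continue_therapy", "call_if_worse"],
    personal_notes=["Your blood pressure looks good. Keep working on exercise and weight loss."],
))
```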

Limitations

Limitations of this study include the sample size (n=100) of summaries collected. The results suggest that an individual physician tends to produce the same type of clinical summary (e.g., making the same errors of omission), so sampling fewer summaries from more physicians may have been more revealing. In addition, no appointment type was excluded from this study. Since ~10% of the summaries collected were not distributed to patients because the clinical note was incomplete, it may have been interesting to analyze summaries based on appointment type to determine whether documentation requirements or habits changed based on this variable.

Conclusions and practice implications

In conclusion, the Meaningful Use clinical summary can and likely does enhance patient and family engagement, but significant barriers stand in the way of its effective use and need to be resolved. These include (a) reducing the complexity of the clinical summary and increasing its readability, including eliminating technical issues that make the e-summary less useful, (b) advocating for patients who will not or cannot use the computer, together with policy changes that stop penalizing physicians for distributing clinical summaries on paper to older adults, (c) addressing physician variation in documentation, and (d) improving the reliability of distribution of the clinical summary.

As our documentation and technology become more sophisticated, thanks in large part to the success of the Meaningful Use program, so will the information we produce from the EHR. We must harness the opportunity to improve the clinical summary and enhance engagement in a cohort of Medicare-eligible older adults with multiple comorbidities, previously found to have the lowest levels of activation, or propensity to engage, in the nation (Hibbard and Cunningham, 2008). Electronic summaries hold even more promise, as they could be programmed to allow patients to click on hyperlinks and learn more about diagnoses, treatments, or medications, or to compare the costs of radiology orders at various imaging centers. Imagine moving from a text-based list of action items toward an interactive plan that demonstrates with video how to perform back-stretching exercises or how to change a dressing, or that links you to a cooking show demonstrating heart-healthy meal preparation techniques. With emerging technology and informatics, the opportunity to communicate effectively with our patients, through the clinical summary and through other methods, will take a quantum leap forward, and patient engagement will as well.
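As a simple illustration of the hyperlink idea, the sketch below renders a diagnosis list as HTML with links to a patient-education search; the endpoint URL is a placeholder, and the rendering approach is purely hypothetical rather than a feature of any current e-summary.

```python
# Sketch of an interactive e-summary fragment: each diagnosis becomes a hyperlink
# to a patient-education resource. The URL below is a placeholder, not a real service.

from urllib.parse import quote

EDUCATION_URL = "https://example.org/health-topics?query="  # placeholder endpoint

def diagnosis_links(diagnoses):
    """Render a diagnosis list as an HTML unordered list with education links."""
    items = [
        f'<li><a href="{EDUCATION_URL}{quote(dx)}">{dx}</a></li>'
        for dx in diagnoses
    ]
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

print(diagnosis_links(["Type 2 diabetes", "Hypertension", "Osteoarthritis"]))
```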

Acknowledgment

The author wishes to acknowledge Dr. Bronwynne Evans for her assistance in data analysis strategy and Dr. Karen Marek for her editorial review.

Financial Support

Research assistance for data analysis and manuscript development was supported by training funds from the National Institutes of Health/National Institute on Nursing Research (NIH/NINR), award T32 1T32NR012718-01 Transdisciplinary Training in Health Disparities Science (C. Keller, P.I.). The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH or the NINR. This research was supported through the Hartford Center of Gerontological Nursing Excellence at Arizona State University College of Nursing & Health Innovation.

Conflicts of Interest

None.

References

Agency for Healthcare Research and Quality (AHRQ). 2013: 2013 Annual Progress Report to Congress: National Strategy for Quality Improvement in Health Care. Retrieved 12 May 2015 from ahrq.gov/workingforquality/nqs/nqs2013annlrpt.htm.
Anderson, G. 2010: Chronic care: making the case for ongoing care. Robert Wood Johnson Foundation. Retrieved 12 May 2015 from http://www.rwjf.org/en/library/research/2010/01/chronic-care.html.
Andrus, A. and Roth, M. 2002: Health literacy: a review. Pharmacotherapy 22, 282–302.
Berkman, N.D., Sheridan, S.L., Donahue, K.E., Halpern, D.J. and Crotty, K. 2011: Low health literacy and health outcomes: an updated systematic review. Annals of Internal Medicine 155, 97–107.
Center for Advancing Health (CAH). 2010. A new definition of patient engagement: what is engagement and why is it important? Washington, DC: Center for Advancing Health.
Centers for Medicare and Medicaid Services (CMS). 2013. Eligible Professional Meaningful Use Core Measures: Measure 13 of 14. Washington, DC: Centers for Medicare and Medicaid Services. Retrieved 12 May 2015 from http://www.cms.gov/Regulations-and-Guidance/Legislation/EHRIncentivePrograms/downloads/13_Clinical_Summaries.pdf.
Centers for Medicare and Medicaid Services (CMS). 2014: Data and program reports. Retrieved 12 May 2015 from http://www.cms.gov/Regulations-and-Guidance/Legislation/EHRIncentivePrograms/DataAndReports.html.
Choi, N. and DiNotto, D. 2013: The digital divide among low-income homebound older adults: internet use patterns, eHealth literacy, and attitudes toward computer/internet use. Journal of Medical Internet Research 15, e93.
Centers for Disease Control (CDC). 2013. State of aging and health in America. Atlanta, Georgia: CDC.
Coulter, A. 2012: Patient engagement: what works? Journal of Ambulatory Care Management 35, 80–89.
Creswell, J. 2013. Qualitative inquiry and research design: choosing among five approaches, third edition. Los Angeles: Sage.
Denzin, N. and Lincoln, Y. 1994. The handbook of qualitative research. New York: Sage.
Gazmararian, J.A., Williams, M.V., Peel, J. and Baker, D.W. 2003: Health literacy and knowledge of chronic disease. Patient Education and Counseling 51, 267–275.
Glaser, B. and Strauss, A. 2012. The discovery of grounded theory: strategies for qualitative research. New Brunswick, USA: Aldine Transaction.
Greene, J., Hibbard, J.H., Sacks, R. and Overton, V. 2013: When seeing the same physician, highly activated patients have better care experiences than less activated patients. Health Affairs 32, 1299–1305.
Gruman, J., Rovner, M., French, M., Jeffress, D., Sofaer, S., Shaller, D. and Prager, D. 2010: From patient education to patient engagement: implications for the field of patient education. Patient Education and Counseling 78, 350–356.
Hibbard, J. and Cunningham, P. 2008. How engaged are consumers in their health and health care and why does it matter? Washington, DC: HSC Research Brief No. 8.
Hibbard, J. and Greene, J. 2013: What the evidence shows about patient activation: better health outcomes and care experiences; fewer data on costs. Health Affairs 32, 207–214.
Hibbard, J., Greene, J. and Tusler, M. 2009: Improving the outcome of disease management by tailoring care to the patient’s level of activation. American Journal of Managed Care 15, 353–360.
Hibbard, J.H., Greene, J. and Overton, V. 2013: Patients with lower activation associated with higher costs; delivery systems should know their patients’ ‘scores’. Health Affairs 32, 216–222.
Hsieh, H. and Shannon, S. 2005: Three approaches to qualitative content analysis. Qualitative Health Research 15, 1277–1288.
Institute of Medicine (IOM). 2004. Health literacy: a prescription to end confusion. Washington, DC: Institute of Medicine.
Institute of Medicine (IOM). 2014. Facilitating patient understanding of discharge instructions. Washington, DC: Institute of Medicine.
Jiggins Colorafi, K. 2014: Computer use by older adults: a review of the literature. Journal of Gerontology and Geriatric Research 3, 164–177.
Keenan, T. 2009. Internet use among midlife and older adults: an AARP bulletin poll. Washington, DC: AARP.
Koh, H., Brach, C., Harris, L. and Parchman, M. 2013: A proposed ‘Health Literate Care Model’ would constitute a systems approach to improving patients’ engagement in care. Health Affairs 32, 357–367.
Kutner, M., Greenberg, E., Jin, Y. and Paulson, C. 2006. The health literacy of America’s older adults: results from the 2003 National Assessment of Adult Literacy. Washington, DC: Institute of Educational Sciences.
Lochner, K., Goodman, R., Posner, S. and Parekh, A. 2013: Multiple chronic conditions among Medicare beneficiaries: state-level variations in prevalence, utilization, and cost, 2011. Medicare and Medicaid Research Review 3, E2–E19.
Miles, M. and Huberman, M. 1994. Qualitative data analysis. Thousand Oaks: Sage.
Oldfield, S.R. and Dreher, H.M. 2010: The concept of health literacy within the older adult population. Holistic Nursing Practice 24, 204–212.
Parker, R. 2013: Advancing health literacy. Paper presented at Health Literacy: A Prescription for Patient Engagement, Kansas City, MO.
Safeer, R. and Keenan, J. 2005: Health literacy: the gap between physicians and patients. American Family Physician 72, 463–468.
Sandelowski, M. 1995: Qualitative analysis: what it is and how to begin. Research in Nursing and Health 18, 371–375.
Sandelowski, M. and Leeman, J. 2012: Writing usable qualitative health research findings. Qualitative Health Research 22, 1404–1413.
US Department of Health and Human Services (DHHS). 2010. Medicare and Medicaid programs; electronic health record incentive program; final rule. Washington, DC: Federal Register. Retrieved 12 May 2015 from www.gpo.gov/fdsys/pkg/FR-2010-07-28/pdf/2010-17207.pdf.
US Department of Health and Human Services (DHHS). 2012: Medicare and Medicaid programs; electronic health record incentive program – Stage 2; health information technology: standards, implementation specifications, and certification criteria for electronic health record technology, 2014 edition; revisions to the permanent certification program for health information technology; final rules. Washington, DC: Federal Register.
Wen, K.-Y., Kreps, G., Zhu, F. and Miller, S. 2010: Consumers’ perceptions about and use of the internet for personal health records and health information exchange: analysis of the 2007 Health Information National Trends Survey. Journal of Medical Internet Research 12, 122–137.
Zickhur, K. and Madden, M. 2012. Older adults and internet use. Pew Research Internet Reports. Washington, DC: Pew Research Center.