
Learning health systems using data to drive healthcare improvement and impact: a systematic review

Abstract

Background

The transition to electronic health records offers the potential for big data to drive the next frontier in healthcare improvement. Yet there are multiple barriers to harnessing the power of data. The Learning Health System (LHS) has emerged as a model to overcome these barriers, yet there remains limited evidence of impact on delivery or outcomes of healthcare.

Objective

To gather evidence on the effects of LHS data hubs or aligned models that use data to deliver healthcare improvement and impact. Any reported impact on the process, delivery or outcomes of healthcare was captured.

Methods

Systematic review of the Scopus, CINAHL, EMBASE, MEDLINE, Medline In-Process and Web of Science databases, using learning health system, data hub, data-driven, ehealth, informatics, collaborations, partnerships, and translation terms. English-language, peer-reviewed literature published between January 2014 and September 2019 was captured, supplemented by a grey literature search. Eligibility criteria included studies of LHS data hubs that reported research translation leading to health impact.

Results

Overall, 1076 titles were identified, with 43 eligible studies across 23 LHS environments. Most LHS environments were in the United States (n = 18), with others in Canada, the UK, Sweden and Australia/NZ. Five (21.7%) produced medium- to high-level evidence in peer-reviewed publications.

Conclusions

LHS environments are producing impact across multiple continents and settings.


Introduction

The transition to digital health, including electronic medical records (EMR), is creating the opportunity and expectation that big data will drive the next frontier of healthcare improvement and transformation. However, there are many barriers to data-driven healthcare improvement, and many approaches have emerged, including the Learning Health System (LHS). A LHS was originally defined by the Institute of Medicine as a broader system in which “science, informatics, incentives, and culture are aligned for continuous improvement and innovation, with best practices seamlessly embedded in the delivery process and new knowledge captured as an integral by-product of the delivery experience” [1, 2]. LHS models embed data-driven research within healthcare, integrating infrastructure and multidisciplinary expertise to deliver improved health [1, 3,4,5,6] via improved access to, and increased use of, data to inform clinical decision making [6, 7]. A LHS applies cyclical processes to turn practice into data, analyse those data to generate new knowledge, and then implement this knowledge into practice in an ongoing and timely manner, supporting near real-time improvement in healthcare and outcomes. A LHS is service-led and community-led to ensure the issues addressed are relevant to clinicians and patients. In a LHS, higher quality, safer, more efficient care is anticipated [8,9,10], and health delivery organizations become better places to work [8]. In principle, the LHS offers a data-driven approach to developing healthcare improvement initiatives, incorporating cyclical systems and processes, expertise and resources within a central data hub [6, 11].

The LHS was prioritised in a national process to establish joint priorities, using a modified Delphi process and nominal group technique. Stakeholders involved in the priority-setting process included representatives from national health data organisations, government agencies, consumers and all centres of the Australian Health Research Alliance [7]. However, only a minority of healthcare organisations worldwide function as a LHS: only 15% of global healthcare leaders described their organisations as adept in data-driven processes to support informed point-of-care decisions [12]. Evidence of, and learnings from, functioning LHS that have improved healthcare are now vital to accelerate adoption and enable digital medicine to iteratively generate new knowledge and shape healthcare moving forward.

A prior 2016 systematic review examined impacts arising from a LHS and identified five papers from four LHS environments, all in the United States [3]. The LHS literature is growing, with ten citations in the peer-reviewed literature in 2007 and over 1000 in 2017 [4]. Yet this field has been plagued by a lack of consistent terminology, including data hubs, living labs, incubator, innovation or informatics hubs, learning networks, learning laboratories, community-clinician participatory data healthcare research, data-driven improvement initiatives, interventional informatics, practice-based data networks, circular data-driven healthcare and the LHS (refs). The LHS “community” is also fragmented, with a lack of awareness of others’ work and limited shared learnings, leading to duplication and the lack of a critical mass of researchers and thought leaders to address barriers to adoption, maintenance, reach and sustainability [4].

For a LHS to generate new knowledge and shape the delivery and transformation of healthcare, arguably these should be health service- and community-led to ensure priority areas for clinicians and patients are addressed in ways that are relevant to local settings, resources and health care systems. However, despite the availability of big data from health care, little is known about how to create effective, sustainable and service-led LHS environments that stimulate partnerships across academic, clinician, community, primary care and industry stakeholders to utilise data iteratively to achieve better health outcomes and service improvements. To address this, an effort is underway to develop a framework for a national network of sustainable LHS data hubs across Australia. A co-design process was applied, including national stakeholder engagement, governance, semi-structured interviews with international and national stakeholders, and workshops. To inform this work, we aimed to complete a systematic review on LHS (or similar entities with alternative names) facilitation of data-driven healthcare improvement and impact. Any reported impact on the process, delivery or outcomes of healthcare was captured. This addresses a key knowledge gap on the impacts of LHS [13].

Although some literature identifies a LHS having operational precision medicine capabilities at point of care [4], we took a broader definition which was informed by stakeholder needs. We define a LHS as a system in which routine health practice data, from service delivery and patient care, can lead to iterative cycles of knowledge generation and healthcare improvement.

Method

We followed the PRISMA (Preferred Reporting Items for Systematic reviews and Meta-Analyses) statement for conducting and reporting a systematic review [14]. This review was registered in PROSPERO (CRD42020153319).

A systematic search of both academic and grey literature identified available publications that met the inclusion criteria. To ensure a comprehensive representation of the literature, we included publications that used qualitative, quantitative, mixed and case study methodologies, and cross-sectional, cohort, experimental and observational designs. The review processes are provided below. Also see the section describing author contributions for further details of who undertook the review tasks.

Data sources and search strategy

An electronic search was conducted of the Scopus, CINAHL, EMBASE, MEDLINE, Medline In-Process and Web of Science databases in March 2019, and again in September 2019 to check for any new publications. Abstracts and publications were imported into and managed within EndNote X8 (https://endnote.com/wp-content/uploads/m/pdf/en-x8-qrg-windows.pdf). A library scientist (AY) guided the search strategy, using a combination of keywords and wildcards, with appropriate truncations tailored for each database. The code used to search each electronic database is shown in Appendix 1. Publications were limited to English language and the past 5 years (2014 to present) to maintain currency in an emerging field and to update the last systematic review in 2016 [3]. To identify any additional articles, the reference lists of included publications were searched manually.
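As an illustration only (the exact database-specific search strings are given in Appendix 1), a strategy of this kind typically ORs truncated or wildcarded terms within a concept group and ANDs the groups together. A minimal sketch, in which the term groupings and use of the `*` truncation wildcard are our assumptions rather than taken from Appendix 1:

```python
# Illustrative only: the actual search strings are in Appendix 1.
# Terms below are those listed in the abstract; the grouping and the
# use of the '*' truncation wildcard are assumptions.
concept_terms = ['"learning health system*"', '"data hub*"',
                 '"data-driven"', 'ehealth', 'informatic*']
approach_terms = ['collaboration*', 'partnership*', 'translation*']

def build_query(groups):
    """OR the terms within each group, then AND the groups together."""
    return " AND ".join("(" + " OR ".join(g) + ")" for g in groups)

query = build_query([concept_terms, approach_terms])
print(query)
```

In practice each database requires its own syntax for truncation, phrase searching and field tags, which is why the tailored strings in Appendix 1 differ from this generic form.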

Study selection

Titles and abstracts of retrieved publications were screened independently by two reviewers (JE, ACJ) to identify publications that potentially met the inclusion criteria. The full text of potentially eligible publications was retrieved and independently assessed for inclusion by the same authors. When discrepancies occurred, consensus was reached through discussion between reviewers.

Inclusion criteria

Inclusion criteria included publications that described an operating LHS (research focused on analysis of LHS data) and translation of research evidence generated from LHS data into healthcare improvement. Table 1 outlines the types of publications about LHS and indicates the type sought in this review. Appendix 2 shows the template used to determine eligibility. Exclusion criteria included post hoc analyses using registry or survey data, animal research, poster abstracts, basic research, non-English-language articles, publications before 2014 and research in a low- or middle-income country using World Bank Atlas classifications [15]. The review focused on high-income countries, as LHS require rapidly developing and sophisticated data-driven systems that need advanced infrastructure, skills and systems, generally not yet established in low- or middle-income countries. Articles that did not report primary studies (e.g. reviews, editorials, commentaries, opinion pieces) were also excluded but, if relevant, their reference lists were checked for additional eligible articles.

Table 1 Journey of a LHS and evidence of impact in the literature. Ticks indicate literature types readily available at the time of writing. *This review seeks to identify the evidence and research translated into the LHS environment

Publications were included if they reported the following, according to the Participants, Intervention, Comparator and Outcome (PICO) approach [16]:

  • Participants included health providers (as key, not nonessential or passive, participants), and the setting included community and health care organisation(s) delivering services to patients;

  • Interventions included initiatives using data for healthcare improvement, new data capability embedded in health services to drive utilisation of data for the purpose of healthcare improvement, embedded data roles, knowledge mobilisation or brokering (with data as a significant component), improving data capability (e.g. how to use existing data), usage of live analytics such as dashboards (e.g. by clinical staff), and data feedback mechanisms involving clinicians;

  • Comparators were not essential

  • Outcomes in eligible articles reported evidence of a LHS translating data-driven research into healthcare, with measurable impact at the patient or service improvement level (e.g. patient health impact measures, patient self-reported outcomes, measures of utilisation of best-practice guidelines, clinical variation, access to integrated service systems utilising data) and evidence of translation into practice.

Grey literature

Peer-reviewed literature was supplemented with a search of the grey literature using a general Internet search with Google and Google Scholar. In addition, we asked investigators and stakeholders (n = 26; identified as working and providing leadership in data hubs, health care services and/or research, and interviewed in a related study about LHS [17]) to identify relevant sources of literature in the form of websites, newsletters, online or print reports, annual reports, research or quality assurance reports, any persons who had established a data hub, and any other relevant contact person. Free-text searching used the same search terms and the inclusion and exclusion criteria noted above. The search of the grey literature ended September 2, 2019.

Data extraction and quality assessment

One author (JE) extracted data from the included publications and identified the level of evidence. Publications with heterogeneous study designs were anticipated; therefore, the GRADE approach was applied to assess the overall quality of evidence based on study design [18]. In the GRADE approach, randomized trials without important limitations provide high-quality evidence, and observational studies without special strengths or important limitations provide low-quality evidence. GRADE recommends that design factors such as concurrent controls can improve the quality of evidence; therefore, studies with concurrent controls and without important limitations were assessed as providing medium-quality evidence. We also assigned a level of evidence of ‘0’ if a publication could not be assessed because it was (a) a peer-reviewed publication that stated the translational benefits of a LHS but provided no objective values, or (b) a non-peer-reviewed article, i.e. grey literature.
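The level-of-evidence rules above amount to a simple mapping from study design to rating; a minimal sketch (our labels, not a formal GRADE implementation):

```python
# Sketch of the evidence-level rules described in this review.
# Not a formal GRADE implementation; labels are illustrative.
def level_of_evidence(design, assessable=True):
    """Map a study design to the level of evidence used in this review."""
    if not assessable:
        # (a) no objective values reported, or (b) grey literature
        return "0"
    levels = {
        "randomised trial": "high",       # without important limitations
        "concurrent controls": "medium",  # design factor upgrades quality
        "observational": "low",
    }
    return levels.get(design, "not assessed")  # e.g. qualitative studies

print(level_of_evidence("concurrent controls"))  # prints "medium"
```

Note that full GRADE assessment also weighs limitations, imprecision and other factors; this sketch captures only the design-based rules stated in this review.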

Data synthesis & analysis

Due to the heterogeneity of interventions, study designs and outcomes, narrative synthesis methods were used. Narrative synthesis collates the collective findings into a coherent, textual narrative, and is appropriate when the review question dictates the inclusion of a wide range of research designs, producing qualitative and/or quantitative findings for which other approaches to synthesis are inappropriate [16].

LHS impact categories were determined by an authorship panel of experts and were based on the healthcare improvement and impact reported in each study. These categories were designed to be broad and inclusive, acknowledging that benefits were often noted across categories; hence the primary reported outcomes determined the final categorisation. The categories were: benefits to patients; benefits to clinician and patient encounters; benefits to clinical services, organisation and system-level performance; and benefits to research and evidence generation.

The included studies were grouped based on the overarching LHS concept. This was done because the review aimed to gather evidence on the effects of LHS (or similar entities with alternative names), and a LHS by design is a system-level intervention that includes multiple processes and projects. This is an accepted process for reporting diverse health-related initiatives in a single peer-reviewed research publication.

Results

The search identified 1076 titles after duplicate removal. Screening of titles and abstracts excluded 946 of these. The remaining 124 full-text articles were examined and a further 81 excluded, leaving 43 articles meeting the inclusion criteria. Overall, the bibliographic database search identified only 26% (11/43) of articles; the grey literature search identified the remainder. Search results are summarised in the PRISMA flow diagram in Fig. 1 and in Table 2.

Fig. 1 PRISMA Flow Diagram

Table 2 Learning health systems with reported outcomes. *Peer-review article

The 43 included articles described translation into health impact across 23 LHS environments: 18 in the USA, two in Canada, and one each in the UK, Sweden and Australia/NZ. LHS settings included local (5), regional (9) and national (9). At least one peer-reviewed article was available for all but one of the 23 LHS: Connected Health Cities in the UK was reported only in the grey literature, with a correspondence article [35] and a final report [61]. This LHS also reported at least 20 research projects on a webpage, but not all had achieved outcomes at the time of writing (Table 2).

The remaining 41 articles were peer-reviewed. These comprised quantitative (n = 33), qualitative (n = 2) and mixed-methods (n = 2) studies, as well as four publications that stated improvements but provided no figures (and were therefore assigned a ‘0’ level of evidence in this review) [30, 40, 51, 56]. Five quantitative studies included a control group and were randomised controlled trials [23, 34, 36, 37, 43]. Another was a comparative study with concurrent controls [57]. Twenty-seven publications used uncontrolled quantitative approaches, predominantly reporting observational data from registries or electronic medical records (EMRs).

These 23 LHS environments can be categorised as:

  • nine real-world data enabled: electronic health record and/or linked data [19, 23,24,25, 34, 35, 37,38,39, 51, 53,54,55, 57, 58, 62]

  • six built around clinical registries [21, 22, 29, 31, 40,41,42, 47, 48, 52]

  • four community of practice networks [20, 27, 28, 32, 33, 49, 50]

  • two academic health centre initiated [43,44,45,46, 59, 60]

  • one finance staff and physicians/surgeons collaboration [30]

  • one commercial operation [56]

Most LHS in this review were enabled by digital data gathered from EMRs, using analytic techniques to translate data into new knowledge and improve clinical or service performance [19, 23,24,25, 34, 35, 37,38,39, 51, 53,54,55, 57, 58, 62]. Other LHS were built around clinical registries housing uniformly collected data, used to describe populations with specific diseases or characteristics and to monitor their outcomes, such as the registers used by ImproveCareNow [21, 22], the Swedish Rheumatology Society [40,41,42] or the national Cystic Fibrosis Foundation in the United States [52].

Some LHS were initiated by services creating a community of practice, particularly when linking smaller sites to other sites to share learnings and expand data pools [20, 27, 28, 32, 33, 49, 50, 63]. One LHS, Optum Labs, was a commercial operation [56] collaborating with an academic partner, the Mayo Clinic.

There were 14 service-led LHS identified in this review [19, 20, 23,24,25, 27, 28, 30, 32,33,34,35, 37,38,39, 49,50,51, 53,54,55, 57, 58, 62]. These service-led LHS had been initiated and enabled by newly implemented digital health data and analytic techniques (e.g. [19, 23,24,25, 34, 35, 37,38,39, 51, 53,54,55, 57, 58, 62]), as well as by the creation of new community of practice networks [20, 27, 28, 32, 33, 49, 50]. Another service-led LHS was initiated by hospital finance leaders who established a respectful and valued collaboration with physicians/surgeons, with both groups driving the LHS to create more efficient care and better surgical outcomes [30]. Improving patient access to, and interaction with, information was key to improving patient experience and outcomes in one service-led LHS [51].

Benefits to patients

Benefits achieved for patients were largely due to better evidence-based care, enabled by site/clinician benchmarking and by individual patient records longitudinally tracking care and outcomes, readily available at the point of care. Examples of patient benefits included identifying distress and despair in cancer patients [19], decreasing postoperative complications (17.7 to 9.6%) [20], increasing the proportion of patients in remission [21], shorter waits for lung cancer treatment commencement after referral (median reduced from 92 to 47 days) [28], and a 6% reduction in polypharmacy [35].

Identifying distress in ambulatory cancer care patients was achieved by electronically sending questionnaires prior to a visit; responses were automatically integrated into the patient EMR, and clinicians were notified of clinically elevated symptoms through messages, which then facilitated referral to psychosocial and supportive care. Psychosocial concerns were reported by 34% of patients; common psychosocial needs were information on advance directives (16%), support with managing stress (15%), information on financial resources (11%), coping with the cancer diagnosis (10%) and information on support groups (9%), and 25% indicated that they would like to be contacted by a health educator for assistance in finding health-related information [25].

Benefits to clinician and patient encounters

Some LHS enabled patients to track and self-manage their condition, and enabled quicker and more evidence-informed decisions for clinical practice, public reporting and research, as well as for clinical process improvement [19, 42]. For example, the Swedish Rheumatology Registry LHS enables a patient to record symptoms, health status and quality of life directly into their EMR before a clinical encounter. Patients access their own EMR at a clinic using a computer tablet, or at home via a patient internet portal. The system combines these data with other data (clinical examinations and laboratory results) to give a graphical display of the patient’s health status, as well as a time graph of trends in the person’s health and treatment. The patient and clinician can view this together, or separately, helping them partner to optimize health. Data were also exported to the national registry, enabling research that contributes to improving patient population health. Evaluations have found that patients greatly value this system for the knowledge it gives them about their changing condition and symptoms over time [42, 64].
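The data flow described here, patient-reported entries merged with clinical results into one longitudinal view, can be sketched as follows; all field names, record values and structures are hypothetical, not drawn from the registry itself:

```python
# Hypothetical sketch of merging patient-reported and clinical data
# into a single longitudinal view, as described for the Swedish
# Rheumatology Registry LHS. Field names and values are invented.
from dataclasses import dataclass

@dataclass
class Entry:
    date: str     # ISO date, so string sort == chronological sort
    source: str   # "patient" (portal/tablet) or "clinic" (exam/lab)
    measure: str
    value: float

def timeline(entries):
    """Return entries in chronological order for graphical display."""
    return sorted(entries, key=lambda e: e.date)

records = [
    Entry("2019-03-01", "patient", "pain_score", 6.0),
    Entry("2019-03-02", "clinic", "crp", 12.0),
    Entry("2019-02-01", "patient", "pain_score", 8.0),
]
for e in timeline(records):
    print(e.date, e.source, e.measure, e.value)
```

The same merged structure supports both the point-of-care trend display and export of de-identified aggregates to a national registry, which is the dual use the paragraph above describes.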

Benefits to clinical services, organisation and system-level performance

Benefits to service delivery were also evident, e.g. time savings of seven minutes per patient visit due to automatic data transfer [22], a 20% improvement in compliance with evidence-based clinical guidelines [28], and increases in pneumococcal vaccination from 62 to 90% and in colorectal cancer screening from 69 to 81% [46].

An essential component of a LHS is a collaborative platform that provides connectivity across silos, organizations and professions. Automated reports using data from the entire LHS led to the efficient identification of patients for standardised care, specialised care, follow-up or clinical trials [21, 27, 28]. Collection of information directly from patients before the clinical encounter can improve time efficiencies and create patient-reported outcome measures (PROMs) that are saved within the EMR, enabling longitudinal tracking of individual patient outcomes and aggregated research [19, 42].

Data architecture appears to be trending away from the traditional relational database and towards a hybridization of big data and high-performance computing. This is driven by the differing data sources held at different sites, which can be linked for the purpose of analysis (ref) or compared as aggregated benchmarks (ref). Benchmarking of site performance can now easily be provided using aggregated data from each site, with the advantage that no individual information is released. Aggregated benchmarking comparisons between clinics/services were reported to lead to substantial benefits in the six LHS built around clinical registers [21, 22, 29, 31, 40,41,42, 47, 48, 52]. The Cystic Fibrosis Foundation attributes the publishing of clinic performance on a public website as an important driver of greater standardisation and implementation of evidence-based care in routine practice [52].

The two LHS initiated by academic health centres had produced publications about implementation issues in developing a system-wide LHS [46, 59]. These publications acknowledged that the premise of the LHS had been embraced and theoretically endorsed for years, but that translating the LHS approach into healthcare was a difficult and long undertaking. They then went on to describe long-term (> 5 years) system-level performance improvements across multiple domains: patient satisfaction, population health screenings, improvement education and patient engagement [43,44,45,46, 59, 60]. Both proposed that their experience in developing a large healthcare setting into a LHS can be applied to other health systems that wrestle with making system-level change when existing cultures, structures and processes vary.

Benefits to research and evidence generation

LHS models include the ability to augment participation in pragmatic real-world trials and comparative effectiveness trials, identify adverse drug effects, and follow data-driven guidelines. Efficient data extraction can directly facilitate evaluation of improvement efforts and can be used to collect data for clinical trials with reduced patient, health service and research team burden.

Quality assessment of publications

Levels of evidence for the included publications are shown in Table 3. The level of evidence was assessed as high for the five RCT publications [23, 34, 36, 37, 43] and medium for one comparative study with concurrent controls [57]. A low level of evidence was assigned to the 27 publications reporting observational data from registries or EMRs. Five (21.7%) of the LHS environments produced medium- to high-level evidence in peer-reviewed publications. These five LHS were all in the United States: three regional and two national. No evidence (the lowest rating) was assigned to six articles that could not be adequately assessed for level of evidence, because four were peer-reviewed publications stating translational benefits of a LHS without providing figures [30, 40, 51, 56] and two were grey literature [35, 61]. The two mixed-methods studies were assessed as providing a low level of evidence, based on assessment of their quantitative components. The two qualitative studies were not assessed.

Table 3 Level of evidence as classified by study design for the publications included in this review. A level of evidence of ‘0’ was assigned if an article could not be assessed because it was (a) a peer-reviewed publication that stated translational benefits of the LHS but provided no figures, or (b) a non-peer-reviewed article, i.e. grey literature

Discussion

With the floodgates of health data now open, there are clear opportunities to turn practice into data, data into new knowledge and knowledge into improved practice; however, there is limited evidence of effective system-level approaches and processes to deliver on these opportunities. This systematic review and narrative evidence synthesis shows that LHS environments are increasing, with demonstrated health benefits across multiple continents and a range of settings. LHS were built on electronic medical records and/or linked data, clinical registers, community of practice networks, academic health science centre partnerships, medical collaborations or commercial operations. Benefits were noted in patient self-management, evidence-based clinician care, clinical organisation or system-level performance, and research. Core features of LHS included strong partnerships, a shared vision generated across stakeholders, agreed principles and governance, implemented systems and processes to enable iterative sustainable improvement, and longitudinal benchmarking and patient tracking with feedback to frontline patients, clinicians and health services. System-level performance improvements were evident in multiple domains: patient satisfaction, population health screenings, improvement education and patient engagement. Quality was variable, and limitations included poor alignment of terminology.

The novelty of this systematic review compared with past LHS reviews lies in the research aim, the inclusion criteria and the systematic methods. As a result, included studies needed to report impact on the process, delivery or outcomes of healthcare arising from the LHS. Unlike a recent white paper [65] and other LHS reviews [4, 6, 13], here papers were excluded if they described a LHS, or usage of data in a LHS, without reporting impact. As noted by Foley and Vale (2017) [13], further research to evaluate the impacts of LHSs is needed, and we sought to advance this in the current systematic review.

The previous systematic review on LHS in 2016 identified only five publications from four LHS environments, all within the United States [3]. Here we have identified 43 studies from 23 LHS environments across continents and settings. We note that significant future evidence is anticipated, with LHS such as Connected Health Cities in the UK noting over 20 projects in the grey literature that are yet to report. The USA has also invested $8 million annually since 2018 to build workforce capacity across 10 institutions to establish a sustainable corps of learning health system researchers [66]. A dedicated journal was established in 2017 to advance the interdisciplinary area of learning health systems and to enable continuous rapid healthcare improvement and transformation of organisational practices. Yet many of the studies here were identified through the grey literature search and reference list checks, and inconsistent terminology remains a key barrier to progress. Moving towards consistent terminology would enable the capture and sharing of learnings on how to design, implement and sustain the complex system-level interventions needed in an effective LHS. Furthermore, the large programs underway in the US and UK will yield more learnings on effective LHS models.

The underpinnings of the LHS included electronic medical records and/or linked data and clinical registers as core data sets. Organisational structures included community of practice networks, academic health science centre partnerships, medical collaborations and commercial operations. The LHS environments producing impact identified in this review show that LHS are not homogeneous entities: they operate at a range of scales and arise from different origins. For example, we identified n = 5 local (e.g. hospital), n = 9 regional (e.g. networks of healthcare providers) and n = 9 national (e.g. linked services in a country) LHS. Similarly, Menear et al. (2019) noted that LHS can differ in scale, operating locally, regionally, nationally or even internationally [5], and implied that local or regional LHSs can evolve alongside or within broader LHSs, with linkages between LHSs or between actors at various system levels [5]. Origins ranged from clinical registries to new clinician communities of practice, which then grew into operational LHS environments.

Benefits noted included patient self-management, evidence-based clinician care, and clinical organisation or system-level performance, as well as benefits to research. To have direct health impact, a LHS must provide timely access to data as well as analysis of those data. Access to integrated real-world data is often impeded by governance and regulatory systems, as well as technical, quality and interoperability issues. This review showed that these issues can be addressed within the LHS continuous improvement process, supported by strategies including natural language processing to improve data quality. The effective LHS identified in this review combined people with relevant workforce capacities and people with analytics capabilities to make sense of the complex data arising from improvement cycles focused on areas of unmet need, public interest and priorities. This was particularly evident in the service-led LHS environments, including the registry-based LHS.

Core features of LHS included strong partnerships, a shared vision generated across stakeholders, agreed principles and governance, implemented systems and processes to enable iterative sustainable improvement, and longitudinal benchmarking and patient tracking with feedback to frontline patients, clinicians and health services. The LHS environments identified in this review as translating data-driven evidence into clinical practice all confirm that a key feature in achieving this is integrated multidisciplinary teams of frontline clinicians, researchers and community members, embedded in healthcare. This is critical for using data from clinical encounters and other sources to generate new knowledge that continuously informs and improves health decision making and practice. This is commensurate with the views of the LHS literature dating back to the earliest mentions of the LHS as a concept only a decade ago [1, 2, 10]. This review has shown that the LHS can be a successful model for creating effective bridges across silos of disciplines and professions, and for facilitating the creative problem solving needed to address the complex problems often faced in healthcare and produce better health outcomes [67].

Limitations

Limitations here include the heterogeneous terminology used, the lack of structured descriptions of LHS components, the varied outcomes and the need for narrative evidence synthesis. Also, only five (21.7%) of the LHS environments identified had produced medium-high level evidence, and all of these were in the United States. Another limitation of this review is that the majority of articles were identified through a grey-literature search of websites and other information; it is therefore likely that other LHS environments have reported impact but, using different terminology, were not captured in this review.

Future research

Moving forward, common terminology is needed, and core components of LHS need to be identified and reported, along with tangible healthcare impacts. Learnings on both barriers and facilitators could also be better captured to advance the field. This review focused on high income countries only; future expansions and updates of the review could extend to low and middle income countries [68]. Additionally, the evidence from this review could assist the development of LHS in high, middle and low income countries to enable better use of data to drive healthcare improvements and deliver impacts. Finally, the COVID-19 pandemic has driven rapid changes in health systems globally, particularly as systems adapted to conduct routine non-COVID healthcare remotely and to provide optimal treatments for patients with COVID-19. The crisis and transformation occurring in healthcare over the last 12 months are deliberately not captured here and are the subject of a separate subsequent project.

Conclusion

The wealth of currently available health data offers clear opportunities for healthcare improvement; however, barriers to the capture, use and application of data are significant. The Learning Health System is emerging as a model to take practice to data, data to new knowledge through analysis, and knowledge to practice through translation. In this systematic review, we demonstrate that LHS across multiple continents and settings can generate measurable healthcare improvement. These LHS were built on electronic medical records and/or linked data, clinical registries, community of practice networks, academic health science centre partnerships, medical collaborations or commercial operations. Key features include longitudinal benchmarking and individual patient tracking, with outcomes readily available to patients, clinicians and health services at the point of care. Benefits included better patient self-management, improved clinician care, optimised clinical service, organisation and/or system-level performance, and benefits to research. Core features of LHS included strong partnerships, a shared vision across stakeholders, agreed principles and governance, and systems and processes that enable iterative sustainable improvement using longitudinal benchmarking. Key opportunities moving forward include harmonising terminology and capturing and sharing learnings on how to advance the LHS, with greater research and evidence of translation into practice, to deliver on the promise of health data to improve and transform healthcare.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

References

1. Institute of Medicine. Roundtable on Value and Science-Driven Health Care: The Learning Health System and its Innovation Collaboratives: Update Report. Washington, DC: IOM; 2011.
2. Institute of Medicine. Making a Difference: Roundtable Charter, Strategy, Tactics, Impact. Washington, DC: IOM; 2014.
3. Budrionis A, Bellika JG. The learning healthcare system: where are we now? A systematic review. J Biomed Inform. 2016;64:87–92.
4. McLachlan S, et al. The Heimdall framework for supporting characterisation of learning health systems. J Innov Health Inform. 2018;25(2):77–87.
5. Menear M, et al. A framework for value-creating learning health systems. Health Res Policy Syst. 2019;17(1):79.
6. Scobie S, Castle-Clarke S. Implementing learning health systems in the UK NHS: policy actions to improve collaboration and transparency and support innovation and better use of analytics. Learn Health Syst. 2020;4(1):e10209.
7. Teede H, et al. Australian Health Research Alliance: national priorities in data driven healthcare improvement. Med J Aust. 2019;211(11):494–7.
8. Agency for Healthcare Research and Quality (AHRQ). AHRQ Pub No. 19-0052-2. 2019.
9. Friedman C, et al. Toward a science of learning systems: a research agenda for the high-functioning learning health system. J Am Med Inform Assoc. 2015;22(1):43–50.
10. Flynn A. Informatics and technology enable us to learn from every patient: pharmacists' many roles in learning health systems. Am J Health Syst Pharm. 2019;76(15):1095–6.
11. Coury J, et al. Applying the plan-do-study-act (PDSA) approach to a large pragmatic study involving safety net clinics. BMC Health Serv Res. 2017;17(1):411.
12. Harvard Business Review Analytic Services. Leading a new era in health care. 2019.
13. Foley JF, Vale L. What role for learning health systems in quality improvement within healthcare providers? Learn Health Syst. 2017;1(4):e10025.
14. Moher D, et al. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6(7):e1000097.
15. World Bank Group. World Bank Country and Lending Groups Country Classification. 2020 [accessed 4th July 2020]. Available from: https://datahelpdesk.worldbank.org/knowledgebase/articles/906519.
16. Aoki NJ, Enticott JC, Phillips LE. Searching the literature: four simple steps. Transfusion. 2013;53(1):14–7.
17. Enticott J, Braaf S, Johnson A, et al. Leaders' perspectives on learning health systems: a qualitative study. BMC Health Serv Res. 2020;20:1087. https://doi.org/10.1186/s12913-020-05924-w.
18. Schünemann H, Brożek J, Guyatt G, Oxman A, editors. GRADE Handbook for Grading Quality of Evidence and Strength of Recommendations. The GRADE Working Group; 2013 [accessed 4th July 2020]. Available from: gdt.guidelinedevelopment.org/app/handbook/handbook.html.
19. Smith SK, Rowe K, Abernethy AP. Use of an electronic patient-reported outcome measurement system to improve distress management in oncology. Palliat Support Care. 2014;12(1):69–73.
20. Simianu VV, Kumar AS. Surgical care and outcomes assessment program (SCOAP): a nuanced, flexible platform for colorectal surgical research. Clin Colon Rectal Surg. 2019;32(1):25–32.
21. Marsolo K, et al. A digital architecture for a network-based learning health system: integrating chronic care management, quality improvement, and research. eGEMs. 2015;3(1):1168.
22. McLinden D, et al. The learning exchange, a community knowledge commons for learning networks: qualitative evaluation to test acceptability, feasibility, and utility. JMIR Form Res. 2019;3(1):e9858.
23. Pitt B, et al. Spironolactone for heart failure with preserved ejection fraction. N Engl J Med. 2014;370(15):1383–92.
24. Starren JB, Winter AQ, Lloyd-Jones DM. Enabling a learning health system through a unified enterprise data warehouse: the experience of the Northwestern University Clinical and Translational Sciences (NUCATS) Institute. Clin Transl Sci. 2015;8(4):269–71.
25. Wagner LI, et al. Bringing PROMIS to practice: brief and precise symptom screening in ambulatory cancer care. Cancer. 2015;121(6):927–34.
26. Shah SJ, et al. Phenomapping for novel classification of heart failure with preserved ejection fraction. Circulation. 2015;131(3):269–79.
27. Fung-Kee-Fung M, et al. Piloting a regional collaborative in cancer surgery using a "community of practice" model. Curr Oncol. 2014;21(1):27–34.
28. Fung-Kee-Fung M, et al. Regional process redesign of lung cancer care: a learning health system pilot project. Curr Oncol. 2018;25(1):59–66.
29. Bhandari RP, et al. Pediatric-Collaborative Health Outcomes Information Registry (Peds-CHOIR): a learning health system to guide pediatric pain research and treatment. Pain. 2016;157(9):2033–44.
30. Briscoe MB, Carlisle B, Cerfolio RJ. Data-driven collaboration: how physicians and administration can team up to improve outcomes. Healthc Financ Manage. 2016;70(7):42–9.
31. Tardif H, et al. Establishment of the Australasian electronic persistent pain outcomes collaboration. Pain Med. 2017;18(6):1007–18.
32. Laws R, et al. The Community Health Applied Research Network (CHARN) data warehouse: a resource for patient-centered outcomes research and quality improvement in underserved, safety net populations. eGEMs. 2014;2(3):1097.
33. Vargas N, et al. Qualitative perspective on the learning health system: how the community health applied research network paved the way for research in safety-net settings. Prog Community Health Partnersh. 2018;12(3):329–39.
34. Vo MT, et al. Prompting patients with poorly controlled diabetes to identify visit priorities before primary care visits: a pragmatic cluster randomized trial. J Gen Intern Med. 2019;34(6):831–8.
35. Ransom C, et al. Correspondence: understanding cognitive barriers to safer prescribing for frail patients. Clin Pharm. 2018;10(5). https://doi.org/10.1211/PJ.2018.20204794.
36. Wade SL, et al. Long-term behavioral outcomes after a randomized, clinical trial of counselor-assisted problem solving for adolescents with complicated mild-to-severe traumatic brain injury. J Neurotrauma. 2015;32(13):967–75.
37. Lowenstein LM, et al. Randomized trial of a patient-centered decision aid for promoting informed decisions about lung cancer screening: implementation of a PCORI study protocol and lessons learned. Contemp Clin Trials. 2018;72:26–34.
38. Arterburn D, et al. Comparative effectiveness and safety of bariatric procedures for weight loss: a PCORnet cohort study. Ann Intern Med. 2018;169(11):741–50.
39. Toh S, et al. The National Patient-Centered Clinical Research Network (PCORnet) bariatric study cohort: rationale, methods, and baseline characteristics. JMIR Res Protoc. 2017;6(12):e222.
40. Eriksson JK, Askling J, Arkema EV. The Swedish Rheumatology Quality Register: optimisation of rheumatic disease assessments using register-enriched data. Clin Exp Rheumatol. 2014;32(5 Suppl 85):S147–9.
41. Neovius M, et al. Drug survival on TNF inhibitors in patients with rheumatoid arthritis: comparison of adalimumab, etanercept and infliximab. Ann Rheum Dis. 2015;74(2):354–60.
42. Ovretveit J, Nelson E, James B. Building a learning health system using clinical registers: a non-technical introduction. J Health Organ Manag. 2016;30(7):1105–18.
43. Cox ED, et al. A family-centered rounds checklist, family engagement, and patient safety: a randomized trial. Pediatrics. 2017;139(5):e20161688. https://doi.org/10.1542/peds.2016-1688.
44. Bartels CM, et al. Connecting rheumatology patients to primary care for high blood pressure: specialty clinic protocol improves follow-up and population blood pressures. Arthritis Care Res. 2019;71(4):461–70.
45. Koslov S, et al. Across the divide: "primary care departments working together to redesign care to achieve the triple aim". Healthc (Amst). 2016;4(3):200–6.
46. Kraft S, et al. Building the learning health system: describing an organizational structure to support continuous learning. Learn Health Syst. 2017;1:e10034.
47. Fife CE. How should outpatient wound clinics honestly measure success? Todays Wound Clin. 2018;12(4). https://www.todayswoundclinic.com/articles/how-should-outpatient-wound-clinics-honestly-measure-success.
48. Serena TE, et al. A new approach to clinical research: integrating clinical care, quality reporting, and research using a wound care network-based learning healthcare system. Wound Repair Regen. 2017;25(3):354–65.
49. Gramlich LM, et al. Implementation of enhanced recovery after surgery: a strategy to transform surgical care across a health system. Implement Sci. 2017;12(1):67.
50. Noseworthy T, Wasylak T, O'Neill B. Strategic clinical networks in Alberta: structures, processes, and early outcomes. Healthc Manage Forum. 2015;28(6):262–4.
51. Psek WA, et al. Operationalizing the learning health care system in an integrated delivery system. eGEMs. 2015;3(1):1122.
52. Schechter MS, et al. The Cystic Fibrosis Foundation patient registry as a tool for use in quality improvement. BMJ Qual Saf. 2014;23(Suppl 1):i9–14.
53. Liu VX, et al. Data that drive: closing the loop in the learning hospital system. J Hosp Med. 2016;11(Suppl 1):S11–7.
54. Liu VX, et al. Multicenter implementation of a treatment bundle for patients with sepsis and intermediate lactate values. Am J Respir Crit Care Med. 2016;193(11):1264–70.
55. Pace WD, et al. The DARTNet Institute: seeking a sustainable support mechanism for electronic data enabled research networks. eGEMs. 2014;2(2):1063.
56. Wallace PJ, et al. Optum Labs: building a novel node in the learning health care system. Health Aff (Millwood). 2014;33(7):1187–94.
57. Lowes LP, et al. 'Learn from every patient': implementation and early results of a learning health system. Dev Med Child Neurol. 2017;59(2):183–91.
58. Noritz G, et al. "Learn from every patient": how a learning health system can improve patient care. Pediatr Qual Saf. 2018;3(5):e100.
59. Moffatt-Bruce S, et al. IDEA4PS: the development of a research-oriented learning healthcare system. Am J Med Qual. 2018;33(4):420–5.
60. Rayo MF, et al. Implementing an institution-wide quality improvement policy to ensure appropriate use of continuous cardiac monitoring: a mixed-methods retrospective data analysis and direct observation study. BMJ Qual Saf. 2016;25(10):796–802.
61. Griffiths, Kapacee. Enabling data flows in Greater Manchester Connected Health City. Connected Health Cities UK; 2019.
62. Wade SL, Kurowski BG. Behavioral clinical trials in moderate to severe pediatric traumatic brain injury: challenges, potential solutions, and lessons learned. J Head Trauma Rehabil. 2017;32(6):433–7.
63. Noseworthy T, Wasylak T, O'Neill BJ. Strategic clinical networks: Alberta's response to triple aim. Healthc Pap. 2016;15(3):49–54.
64. Hvitfeldt H, et al. Feed forward systems for patient participation and provider support: adoption results from the original US context to Sweden and beyond. Qual Manag Health Care. 2009;18(4):247–56.
65. Zurynski Y, et al. Mapping the learning health system: a scoping review of current evidence (white paper). Sydney, Australia: Australian Institute of Health Innovation and the NHMRC Partnership Centre for Health System Sustainability; 2020.
66. PCORI Blog. Funding the next generation of learning-health-system researchers. 2017 [cited 18th Sep 2019]. Available from: https://www.pcori.org/blog/funding-next-generation-learning-health-system-researchers.
67. Melder A, et al. An overview of healthcare improvement: unpacking the complexity for clinicians and managers in a learning health system. Intern Med J. 2020;50(10):1174–84.
68. Lessard L, et al. Type and use of digital technology in learning health systems: a scoping review protocol. BMJ Open. 2019;9(5):e026204.

Acknowledgments

The authors would like to thank Monash Partners COO, Angela Jones, for her expertise in the Monash Partners Learning Health System Framework. The authors would also like to thank Monash Partners Data Driven Committee members for their insights on where to locate relevant literature. We also want to thank the Monash University librarian, Anne Younge, for her expertise and advice when developing the search strategy.

Funding

This research received funding from Monash Partners Academic Health Sciences Centres from partner health services and from the Australian Government Medical Research Future Fund. HT is funded by an NHMRC MRFF fellowship and JE is funded on a Monash Partners fellowship.

Author information

Affiliations

Authors

Contributions

HT was involved in conceptualising, obtaining funding, co-designing the systematic review and writing the manuscript. AJ was involved in conceptualising, co-designing the systematic review, devising the search strategy, screening articles and writing the manuscript. JE devised and undertook the searches and, with AJ, screened all articles. JE extracted data from the included publications, identified the level of evidence, and drafted and wrote the manuscript. The author(s) read and approved the final manuscript.

Corresponding authors

Correspondence to Joanne Enticott or Helena Teede.

Ethics declarations

Ethics approval and consent to participate

As this review used publicly available information, it was unnecessary to obtain approval from an ethics committee.

Consent for publication

Not applicable.

Competing interests

Not applicable.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Enticott, J., Johnson, A. & Teede, H. Learning health systems using data to drive healthcare improvement and impact: a systematic review. BMC Health Serv Res 21, 200 (2021). https://doi.org/10.1186/s12913-021-06215-8

Keywords

  • Health services research
  • Learning health systems
  • Health data hubs
  • Digital health