How well does the minimum data set measure healthcare use? a validation study

Abstract

Background

To improve care, planners require accurate information about nursing home (NH) residents and their healthcare use. We evaluated how accurately measures of resident user status and healthcare use were captured in the Minimum Data Set (MDS) versus administrative data.

Methods

This retrospective observational cohort study was conducted on all NH residents (N = 8832) from Winnipeg, Manitoba, Canada, between April 1, 2011 and March 31, 2013. Six study measures were created. NH user status (newly admitted NH residents, those who transferred from one NH to another, and those who died) was measured using both MDS and administrative data. Rates of in-patient hospitalizations, emergency department (ED) visits without subsequent hospitalization, and physician examinations were also measured in each data source. We calculated the sensitivity, specificity, positive and negative predictive values (PPV, NPV), and overall agreement (kappa, κ) of each measure as captured by MDS using administrative data as the reference source. For each measure, logistic regression also tested whether the level of disagreement between data systems was associated with resident age and sex, plus NH owner-operator status.

Results

MDS accurately identified newly admitted residents (κ = 0.97), those who transferred between NHs (κ = 0.90), and those who died (κ = 0.95). Measures of healthcare use were captured less accurately by MDS, with high levels of both under-reporting and false positives (e.g., for in-patient hospitalizations sensitivity = 0.58, PPV = 0.45), and moderate overall agreement levels (e.g., κ = 0.39 for ED visits). Disagreement was sometimes greater for younger males, and for residents living in for-profit NHs.

Conclusions

MDS can be used as a stand-alone tool to accurately capture basic measures of NH use (admission, transfer, and death), and by proxy NH length of stay. As compared to administrative data, MDS does not accurately capture NH resident healthcare use. Research investigating these and other healthcare transitions by NH residents requires a combination of the MDS and administrative data systems.

Background

Nursing homes (NHs) have become increasingly complex care environments and are in growing demand. The number of 75+ year olds living in both Canada [1] and the United States [2] is expected to double by 2030, and in any given year 1.4 million U.S. citizens [3] and 225,000 Canadians [4] reside in a NH. NH use patterns have also changed dramatically in recent years, with many residents now admitted later in life and with more complex needs [5,6,7]. Presently, about half of all newly admitted NH residents require weight-bearing help to complete activities of daily living [8], and many are afflicted with co-morbid diseases plus a range of cognitive, behavior, and continence challenges [7, 9, 10]. Rates of healthcare use (e.g., emergency department [ED] visits, hospitalizations) by NH residents have also increased considerably [11, 12], and better management of these transitions has become a major reform focus [13, 14].

Continued research into these and related areas requires high quality data systems. While some authors report challenges using administrative data to measure patient health status and adverse events (e.g., hospital acquired pressure ulcers) [15], several have shown that these records accurately capture many types of healthcare use without recall bias or loss to follow-up [16,17,18,19]. In some instances, however, the breadth of administrative records available and access to them for research vary by geography. Conversely, standardized comprehensive assessments using the interRAI family of instruments are mandated widely across North America [20, 21]. Amongst these, the Resident Assessment Instrument 2.0 (RAI 2.0) exists for NH residents in most Canadian provinces and is comprised of assessment forms (called the Minimum Data Set; MDS), a standard operating manual (provides definitions and assessment guidelines), and Clinical Assessment Protocols (CAPs, used to help develop resident care plans) [20]. MDS assessments richly define NH residents by (for example) their functional dependence, cognitive impairment, and use of healthcare services including prescription drug use. Validation has been conducted on various MDS scales that measure cognitive performance [22], pain [23], and depression [24]. Authors have also examined the accuracy of MDS quality care metrics [25], plus specific MDS items that capture resident prescription drug use [26, 27] and chronic disease [28, 29]. With some exceptions [30], the accuracy with which MDS captures resident healthcare use is understudied. The present research helps to fill this gap by linking MDS to administrative data for a population of NH residents from Winnipeg, Manitoba. The objective of our study is to assess the accuracy of MDS for capturing: i) NH resident admission, transfers between NHs, and death; and ii) hospital use, ED transfers, and ambulatory care physician visits. These study results define the extent to which MDS data can be used as a stand-alone tool for providing key NH use and healthcare transition information pertinent to guiding NH care reform.

Methods

Research environment and study cohort

Manitoba is one of 10 Canadian provinces and has a population of 1.3 million people dispersed across five geographically diverse regions responsible for delivering healthcare. Four of these regions are rural or remote and the Winnipeg Health Region (WHR, the city of Winnipeg) is the only large metropolitan area (population 725,000). About two-thirds of Manitoba’s total NH beds (N = 9586) are located in Winnipeg (N = 5636 beds across 38 facilities) [31]. This study was conducted on the population of Manitobans who resided in a Winnipeg NH for at least 1 day between April 1, 2011 and March 31, 2013.

Data sources

The data sources used to conduct this research are described in Table 1. The national standard for reporting MDS data has been established by the Canadian Institute for Health Information (CIHI; https://www.cihi.ca/en). All NH residents are required to have an MDS admission record (providing an admission date and an ‘admission from’ location) and a discharge record (providing the reason and date of discharge). A full-length assessment is also required for each resident at NH admission and annually thereafter, interspersed by abbreviated quarterly assessments. Each full assessment contains responses to about 400 standardized items that profile residents by various clinical (e.g., cognitive performance) and healthcare use (e.g., emergency department use) domains. Each assessment is completed by a trained assessor (usually a nurse) using all available information including clinical charts and observations made by the family, staff, physicians, and volunteers.

Table 1 Data Sources and Methods Used to Create Study Measures

Administrative data have been available in Winnipeg since 1984, although some data sources originate before this date. The Registry File contains a unique identifier for every Manitoban including their birth and death date, and is used for linkage to all other files. The NH Use File contains the dates of admission and discharge for every NH resident in Manitoba, while the Hospital Discharge Abstract Database (DAD) provides the dates of hospital admission and discharge plus reasons for hospitalization. The Emergency Department Information System (EDIS) provides the dates of ED visits plus details about the care provided during each visit, and the Medical Claims File provides the dates and types of physician visits as well as the primary reason for the visit.

The standards for capturing data in most administrative files are managed centrally by the Government of Manitoba. Certified health information management professionals are responsible for creating all DAD abstracts, and the Registry File receives weekly birth and death updates from Vital Statistics. Re-abstracting studies completed by CIHI conclude that DAD data are of particularly high quality in Manitoba, especially for capturing patient diagnoses and procedures [32]. Previous research has demonstrated that the Medical Claims File provides diagnostic and procedural data that are highly comparable to other information sources [16]. Researchers have also shown that 97.5% of ED patients recorded as being hospitalized in EDIS were found in the DAD on the same day [33].

Both MDS and administrative data are housed at the Manitoba Centre for Health Policy, Max Rady College of Medicine, University of Manitoba. Unique, person-level and time-stamped identifiers exist in each of these files, enabling researchers to link them when studying system-level patterns of healthcare use. Further details about general data linkage processes are available elsewhere [17, 18].
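As a rough illustration of this type of person-level linkage (not part of the original analysis), the minimal sketch below joins two toy files on a shared resident identifier. The identifiers, column names, and dates are invented and do not reflect the repository's actual file structures.

```python
import pandas as pd

# Toy linkage sketch: person_id values, column names, and dates are invented
# for illustration and are not the repository's actual fields.
registry = pd.DataFrame({
    "person_id": [101, 102],
    "birth_date": pd.to_datetime(["1928-05-01", "1931-11-12"]),
    "death_date": pd.to_datetime(["2012-07-30", None]),
})
nh_use = pd.DataFrame({
    "person_id": [101, 102],
    "nh_admit": pd.to_datetime(["2011-06-15", "2010-02-01"]),
    "nh_discharge": pd.to_datetime(["2012-07-30", None]),
})

# Join on the shared identifier; DAD, EDIS, Medical Claims, and MDS records
# would be linked to the cohort in the same way before analysis.
cohort = registry.merge(nh_use, on="person_id", how="inner")
print(cohort)
```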

Measures of NH user status and healthcare use

The study measures are described in Table 1. Measures of NH user status include being newly (i.e., first time) admitted into the NH system, having transferred between NH facilities one or more times, and having died during the study period. For residents defined as newly admitted in both data systems, we also measured whether they were admitted directly from hospital or another location.

Measures of healthcare use include in-patient hospitalizations, ED visits not resulting in hospitalization, and ambulatory care physician visits. These measures were assessed during specific time periods (episodes) defined by the MDS assessment schedule (called ‘look-back’ episodes in this manuscript). During each full MDS assessment, assessors are asked to record the number of times the resident was admitted to a hospital in the last 90 days, and to curtail this look-back episode if a previous full assessment occurred during this time. Based on these criteria, the equivalent date-stamped episodes were created using DAD (with 2 days of overlap to account for data recording errors), and the number of hospital discharges during each episode was compared between data systems. Also, some of these look-back episodes include days preceding a resident’s NH admission date. Comparisons of hospital use across data systems were therefore made separately on assessments completed within the first 90 days of NH admission (i.e., where at least some of the ‘look-back’ episode preceded the resident’s stay in a given NH), and all others.
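To make this episode construction concrete, the sketch below (hypothetical data and column names; pandas used purely for illustration, not the study's actual SAS code) builds each look-back window from the assessment date, curtails it at the previous full assessment, widens it by 2 days at each end, and counts DAD discharges falling inside it.

```python
import pandas as pd

# Hypothetical sketch of look-back episode construction and DAD matching.
assessments = pd.DataFrame({
    "person_id": [101, 101],
    "assess_date": pd.to_datetime(["2011-09-01", "2012-09-01"]),
    "prev_full_assess": pd.to_datetime([None, "2011-09-01"]),
})
dad = pd.DataFrame({
    "person_id": [101, 101],
    "discharge_date": pd.to_datetime(["2011-08-10", "2012-08-25"]),
})

def episode_start(row):
    # 90-day look-back, curtailed at the previous full assessment
    # if one occurred during that window.
    start = row["assess_date"] - pd.Timedelta(days=90)
    if pd.notna(row["prev_full_assess"]):
        start = max(start, row["prev_full_assess"])
    return start

def count_discharges(row, slack_days=2):
    # Widen the episode by 2 days at each end to allow for recording errors.
    lo = row["episode_start"] - pd.Timedelta(days=slack_days)
    hi = row["assess_date"] + pd.Timedelta(days=slack_days)
    person = dad[dad["person_id"] == row["person_id"]]
    return int(person["discharge_date"].between(lo, hi).sum())

assessments["episode_start"] = assessments.apply(episode_start, axis=1)
assessments["dad_discharges"] = assessments.apply(count_discharges, axis=1)
print(assessments[["person_id", "assess_date", "dad_discharges"]])
```

The resulting episode-level discharge counts would then be compared against the count recorded by the MDS assessor for the same window.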

This same strategy was used to identify ED visits not resulting in hospitalization (Table 1). Counts of these visits are recorded during full MDS assessments only. Assessors are asked to record the number of these visits in the last 90 days, curtailed if previous full assessments occurred during this time. The equivalent time-stamped episodes (plus or minus 2 days to account for error) were created using EDIS. Counts of ED visits were compared across data systems during each episode, stratified by assessments completed within or following each resident’s first 90 days of NH stay.

Physician visits are captured in MDS during both full and quarterly assessments. Assessors recorded how often residents were examined by a physician (or a related care provider such as a nurse practitioner or dentist with additional training) in the past 2 weeks. Tariff codes from the Medical Claims file were used to identify a subset of physician visits (i.e., ambulatory care or examinations in the NH) that occurred during each MDS look-back episode, and a list of the major tariff codes used to identify ambulatory care physician visits is provided in Table 1. Comparisons across data systems identify the number of residents who were examined on zero versus one or more days during each look-back episode.
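For a single look-back episode, that comparison might look like the sketch below. The tariff codes and claim records are placeholders only (the study's actual ambulatory-care codes are listed in Table 1), and the 0 versus 1+ classification is what would be compared against the MDS response.

```python
import pandas as pd

# Placeholder tariff codes; the actual Manitoba codes appear in Table 1.
AMBULATORY_TARIFFS = {"8470", "8471"}

claims = pd.DataFrame({
    "person_id": [101, 101, 101],
    "service_date": pd.to_datetime(["2012-08-20", "2012-08-20", "2012-08-28"]),
    "tariff": ["8470", "8471", "9999"],
})

assess_date = pd.Timestamp("2012-09-01")
lookback_start = assess_date - pd.Timedelta(days=14)  # 2-week MDS look-back

in_window = claims["service_date"].between(lookback_start, assess_date)
is_exam = claims["tariff"].isin(AMBULATORY_TARIFFS)

# Count distinct days with a qualifying physician examination, then compare
# the resulting 0 versus 1+ classification against the MDS response.
exam_days = claims.loc[in_window & is_exam, "service_date"].nunique()
print(f"examined on {exam_days} day(s); classified as 1+ days: {exam_days >= 1}")
```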

Statistical analysis

The study cohort was first described by the number of different MDS records (i.e., assessment and discharge records, full and quarterly assessments) that existed; and then by resident age (< 65, 65–74, 75–84, 85+ years), sex, and whether people lived in a NH that was not-for-profit, for profit, or a combination of these ownership types during the study period. Measures of NH user status and healthcare use were then compared between the MDS and administrative data systems, using the latter as the reference data source. The unit of analysis for NH user status measures is the person, while the unit of analysis for measures of health care use is the MDS look-back episode.

The accuracy of each study measure was calculated. Sensitivity was defined as the proportion of true cases correctly identified by MDS, while specificity was defined as the proportion of true negatives correctly identified by MDS. Positive predictive value (PPV) was defined as the proportion of all cases defined by MDS that were truly positive, while negative predictive value (NPV) was defined as the proportion of all non-cases in MDS that were truly negative. Cohen’s kappa coefficient (κ) [34] was used to measure overall agreement between these data sources. As recommended by Cohen, kappa values can be categorized as: i) < 0.20 (poor); ii) 0.20 to 0.39 (fair); iii) 0.40 to 0.59 (moderate); iv) 0.60 to 0.79 (good); and v) 0.80 to 1.0 (very good).
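All of these statistics can be computed from a 2 × 2 table in which administrative data serve as the reference standard. The sketch below uses invented cell counts purely for illustration; the study's actual contingency tables are in Additional file 1.

```python
# Invented cell counts for one MDS-versus-administrative comparison,
# with administrative data as the reference standard.
tp, fp, fn, tn = 120, 40, 60, 780

sensitivity = tp / (tp + fn)   # true cases correctly identified by MDS
specificity = tn / (tn + fp)   # true negatives correctly identified by MDS
ppv = tp / (tp + fp)           # MDS-positive records that are truly positive
npv = tn / (tn + fn)           # MDS-negative records that are truly negative

# Cohen's kappa: observed agreement corrected for chance agreement.
n = tp + fp + fn + tn
p_observed = (tp + tn) / n
p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
kappa = (p_observed - p_expected) / (1 - p_expected)

print(f"sens={sensitivity:.2f} spec={specificity:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f} kappa={kappa:.2f}")
```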

Multiple logistic regression was conducted on each study measure to determine whether the level of disagreement between data systems varied by resident age and sex, plus NH owner-operator type. For each model, residents who were similarly identified by each data system (e.g., newly admitted by both MDS and administrative records) were included in the agreement category (reference group), while those who were dis-similarly identified were included in the disagreement category. Model estimates are presented as adjusted odds ratios (AORs) with 95% confidence limits (95% CLs). Given the hierarchical nature of these data, a random intercept was included for each NH to account for the clustering of residents within facilities. Using the strategies outlined by Ene et al. (2014), the intraclass correlation coefficient (ICC) was computed to determine what proportion of the total variation in study outcomes was accounted for by NH facilities [35]. Concordance (C) statistics were calculated to measure how well each model discriminated across agreement categories. This value ranges from 0.5 (no discrimination) to 1.0 (perfect discrimination). All analyses were conducted using SAS version 9.4.
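Under the latent-variable approach described by Ene et al. [35], the residual variance of a random-intercept logistic model on the logit scale is fixed at π²/3, so the ICC reduces to the facility-level variance divided by the total variance. A minimal sketch of this calculation is shown below; the variance estimate is an invented value for illustration (the models themselves were fit in SAS).

```python
import math

# Latent-variable ICC for a random-intercept logistic model: the residual
# variance on the logit scale is pi^2 / 3. The facility-level variance is an
# invented value, not a study estimate.
facility_variance = 0.35
residual_variance = math.pi ** 2 / 3

icc = facility_variance / (facility_variance + residual_variance)
print(f"ICC = {icc:.2f}")  # share of outcome variation attributable to NH facilities
```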

Results

The study cohort included 8832 NH residents with 52,818 MDS records (Table 2). A small number of residents (N = 137) had no MDS data, while just under half had at least one admission (41.4%) and discharge (46.2%) record. Full MDS assessments were completed on 90.5% of residents, and 60.5% of residents had one or more quarterly assessments.

Table 2 Counts of MDS Records and Characteristics of the Study Cohort

The distribution of MDS records was similar by resident age and sex (Table 2); 56.1% of the study cohort was 85+ years old and 55.1% of MDS records were completed on these residents. Similarly, 4.2% of the study cohort was younger than 65 years old and 4.6% of MDS records were completed on these people. Also, 69.5% of the study cohort was female and 70.4% of MDS records were completed for females. Similar results exist for NH owner-operator status (e.g., 36.8% of the study cohort resided in a for-profit NH during the study period and 33.7% of all MDS assessments were completed on residents who resided in these NHs).

Administrative data revealed that 37.5% of NH residents were newly admitted during the study period, while MDS revealed that 36.9% of residents were newly admitted during this time (Fig. 1; contingency tables showing denominators are provided in Additional file 1). Similarly, 37.1 and 35.4% of residents were reported to have died during the study period, using administrative and MDS data, respectively. Administrative data revealed that 14.0% of residents transferred between NHs at least once during the study period, versus 12.2% of residents as defined using MDS.

Fig. 1 Nursing Home User Status and Episodes of Healthcare Use, by Data Source

Counts of healthcare use between data systems are also reported (Fig. 1). Residents were reported to have not been hospitalized during 95.9 and 94.8% of look-back episodes, using the administrative and MDS data, respectively. Similarly, administrative data revealed that residents had no ED visits without subsequent hospitalization during 92.4% of look-back episodes, versus 95.5% of look-back episodes as reported by MDS. Residents were not examined by a physician during 51.6 and 60.0% of look-back episodes, as reported using administrative and MDS data, respectively. MDS also underreported the frequency of physician examinations during these times (i.e., residents were examined by a physician on multiple days during 8.0 and 4.2% of all look-back periods, using administrative and MDS data, respectively).

These study outcomes were compared in more detail (Table 3; Additional file 1). Sensitivity, specificity, PPV, and NPV were all above 0.86 for measures of NH user status, and in most instances these values were greater than 0.96. Kappa (κ) values ranged from 0.90 (NH transfers) to 0.97 (newly admitted residents), indicating very high levels of overall agreement. For residents identified as newly admitted in both systems, MDS also accurately distinguished residents who transferred into the NH from hospital versus from other locations (e.g., κ = 0.74). The median (10th, 90th percentile) difference between NH admission dates in these systems was 0 (0, 7) days (data not shown). Similarly, for residents who were reported as dying in both systems, the median (10th, 90th percentile) difference between death dates was 0 (0, 0) days (data not shown).

Table 3 Levels of Agreement Between MDS and Administrative Health Care Use Records; Value (95% Confidence Limits)

As compared to administrative data, measures of healthcare use in MDS are characterized by high levels of specificity (e.g., 0.97 for hospitalizations) and NPV (e.g., 0.97 for ED visits not ending in hospitalization). However, considerable under-reporting and false positives occurred in MDS (Table 3; Additional file 1). MDS correctly identified hospitalizations during only 58% of the look-back episodes (i.e., sensitivity = 0.58) and, conversely, often over-reported hospitalizations (PPV = 0.45). Similarly, during episodes where agreement between data systems was found, MDS both underestimated (sensitivity = 0.60) and over-reported (PPV = 0.41) how often multiple hospitalizations occurred. This same pattern of results exists for non-hospitalized ED visits (sensitivity = 0.34, PPV = 0.57) and days of physician examinations (sensitivity = 0.51, PPV = 0.71). Kappa values were correspondingly low for all of these measures, ranging from 0.24 for days of physician examinations to 0.49 for in-patient hospitalizations. These results for hospitalizations and ED visits exclude assessments (N = 4354) completed within the first 90 days of NH admission. Outcomes on this subset of assessments were marginally worse than those reported in Table 3 (hospital admissions: sensitivity = 0.41, specificity = 0.96, PPV = 0.59, NPV = 0.93, κ = 0.43; ED visits without hospitalization: sensitivity = 0.29, specificity = 0.97, PPV = 0.49, NPV = 0.93, κ = 0.32).

Table 4 Adjusted Odds Ratios (95% Confidence Limits) Comparing the Disagreement between MDS and Administrative Data

AORs (Table 4) show that the odds of disagreement between data systems varied across some, but not all, resident subgroups. Disagreement was significantly lower for residents 85+ years versus < 65 years when defining newly admitted residents (AOR = 0.3; 95% CL = 0.2, 0.7), residents with one or more NH transfers (AOR = 0.3; 95% CL = 0.2, 0.6), and hospitalizations (AOR = 0.7; 95% CL = 0.5, 1.0). The odds of disagreement were significantly higher for males when measuring NH transfers (AOR = 1.6; 95% CL = 1.1, 2.1) and resident death (AOR = 1.5; 95% CL = 1.1, 2.0). In some instances (new admissions, NH transfers, in-patient hospitalizations), disagreement between data systems was significantly higher for residents who resided in a for-profit versus not-for-profit NH at some time during the study period. ICC values ranged from 0.03 (days of physician examination and in-patient hospitalizations) to 0.22 (newly admitted residents and their transfer-from location). C-statistic values were 0.60 or lower for all outcomes except when defining newly admitted residents and those with one or more NH transfers, indicating that model covariates did not effectively discriminate between agreement categories.

Discussion

This study investigates how accurately key measures of NH user status and healthcare use were captured in the interRAI Minimum Data Set (MDS) assessment instrument. Our results show that MDS accurately defined NH residents by their admission and death dates (and therefore length of stay), and also accurately identified newly admitted NH residents who transitioned from hospital and those who transferred between NH facilities. However, as compared to administrative data, MDS both undercounted the true rate of healthcare use and provided false positives. Using Cohen’s agreement criteria [34], kappa values for these time-specific comparisons were fair when measuring physician visits and non-hospitalized ED visits, and moderate when measuring in-patient hospitalizations.

These results are supported by some existing literature. Mor et al. (2011) report that MDS accurately defines resident death (sensitivity = 0.84, PPV = 0.95) as compared to administrative data [30], but show that 20% of hospitalizations reported in MDS were not found in administrative records. Similarly, Lix et al. (2015) show that MDS records accurately defined anti-psychotic and anti-depressant drug users, but observed only moderate agreement with administrative data (κ = 0.40) when identifying anti-anxiety or hypnotic drug users [27]. Lix et al. (2014) also report that MDS records (versus administrative data) accurately define NH residents with some (e.g., diabetes) but not all (e.g., congestive heart failure, COPD, stroke) types of chronic disease, and often with high rates of both under-reporting and false positives [28]. In their literature review, Hutchinson et al. (2010) report that MDS quality indicators are captured with varying degrees of accuracy, with under-reporting noted for some metrics and over-reporting noted for others [25]. The present research supports and extends these findings, by showing that MDS can be used to accurately measure NH user status at the population level, but not necessarily to measure these residents’ healthcare use. As noted by Hirdes et al. (2013), MDS assessments completed using some software systems may be ‘auto-populated’ with responses from the previous assessment, and failure to amend these automatic updates may propagate false positive responses [20].

Accurate data systems are needed to inform NH care innovations. Several authors have investigated the consequences of transferring into NH from hospital [36, 37], have profiled the care needs of newly admitted NH residents [8, 38], and have studied how community-based housing options have impacted NH use [39]. Continued research in these and related areas is feasible using MDS as a stand-alone data source, and has particular value when combining the clinical and user status data available in this tool. Similarly, some authors have defined the extent to which NH bed supply varies by geography [40], and MDS can be used to investigate some of the key consequences (e.g., differences in resident profiles, length of NH stay) of these different healthcare approaches. Additional research needed to support innovation includes measuring the rate and type of emergency department visits made by NH residents [41], comparing these transition rates to those of assisted living residents, and defining hospitalizations preceding NH resident death [11, 42]. Continued research in these and related areas requires a combination of the MDS and administrative data systems.

While population-based, the present research was conducted using data from a single urban health region. Just as NH use patterns and resident clinical profiles vary across Canada [7], so too may the results of the present research. In addition to housing MDS records, the CIHI national repository houses administrative data capturing some of the measures of healthcare use defined in this research. We recommend that the present study be replicated more broadly using CIHI data, with comparisons across jurisdictions.

Our study results for healthcare use do not account for the nested nature of assessments within residents, and hence the confidence limits for these measures may be too narrow. To partially test this hypothesis, ED use was compared between data sources using one full MDS assessment selected per resident. Results were similar to those shown in the present study (data not shown). Also, while we recognize that the definition of physician visits is broader in MDS (which includes examinations provided by nurse practitioners and dentists with additional training) than in Medical Claims (ambulatory physicians only), during the time of this study few nurse practitioners worked in Winnipeg NHs, and dentists do not provide regular NH care in this region. Lastly, our comparisons of healthcare use are confined to the MDS look-back episodes as defined in this research. While MDS collects healthcare use data intermittently, administrative data capture these records continuously. Given these different approaches, MDS considerably underestimates the overall (i.e., daily) measures of healthcare use during the study period: 45.4% versus 17.6% of residents were hospitalized at least once according to the administrative and MDS data, respectively; 51.6% versus 12.3% of residents had one or more ED visits not ending in hospitalization; and 97.6% versus 77.3% of residents were examined by a physician (data not shown).

Conclusions

This study examines how accurately measures of NH user status and healthcare use are captured in the MDS instrument as compared to administrative data. Our results show that MDS accurately defined key NH user characteristics. However, as compared to administrative data, MDS records both undercounted the true rate of key healthcare use measures and often provided false positives. Future studies measuring these healthcare use patterns require MDS linkage to administrative data.

Abbreviations

AORs:

Adjusted odds ratios

CIHI:

Canadian Institute for Health Information

CLs:

Confidence limits

COPD:

Chronic Obstructive Pulmonary Disease

DAD:

Hospital Discharge Abstract Database

ED:

Emergency department

EDIS:

Emergency Department Information System

ICC:

Intraclass correlation coefficient

MDS:

Minimum Data Set

NH:

Nursing home

NPV:

Negative predictive values

PPV:

Positive predictive values

WHR:

Winnipeg Health Region

References

  1. Statistics Canada. Table 052–0005 - Projected population, by projection scenario, age and sex, as of July 1, Canada, provinces and territories, annual (persons X1,000). 2014 http://www5.statcan.gc.ca/cansim/a26?lang=eng&id=520005 Accessed 11 Jul 2017.

  2. Vincent GK, Velkoff VA. The next four decades: The older population in the United States, 2010–2050. 2010. https://www.census.gov/prod/2010pubs/p25-1138.pdf Accessed 17 Jul 2017.

  3. Harris-Kojetin L, Sengupta M, Park-Lee E, Valverde R. Long-term Care Services in the United States: 2013 overview. Vital Health Stat 3. 2013:1–107.

  4. Statistics Canada. Living arrangements of seniors - families, households and marital status structural type of dwelling collectives, 2011 census of Population 2012 Sep. http://www12.statcan.gc.ca/census-recensement/2011/as-sa/98-312-x/98-312-x2011003_4-eng.pdf Accessed 11 Jul 2017.

  5. Berta W, Laporte A, Zarnett D, Valdmanis V, Anderson G. A pan-Canadian perspective on institutional long-term care. Health Policy. 2006;79:175–94.

  6. Doupe M, Fransoo R, Chateau D, Dik N, Burchill C, Soodeen R-A, et al. Population aging and the continuum of older adult Care in Manitoba. Winnipeg: Manitoba Centre for Health Policy; 2011. http://mchp-appserv.cpe.umanitoba.ca/reference/LOC_Report_WEB.pdf Accessed 29 May 2013

  7. Hirdes JP, Mitchell L, Maxwell CJ, White N. Beyond the “iron lungs of gerontology”: using evidence to shape the future of nursing homes in Canada. Can J Aging. 2011;30:371–90.

  8. Doupe M, St John P, Chateau D, Strang D, Smele S, Bozat-Emre S, et al. Profiling the multidimensional needs of new nursing home residents: evidence to support planning. J Am Med Dir Assoc. 2012;13:487.e9–17.

  9. Doupe M, Brownell M, St John P, Strang DG, Chateau D, Dik N. Nursing home adverse events: further insight into highest risk periods. J Am Med Dir Assoc. 2011;12:467–74.

  10. Gruber-Baldini AL, Stuart B, Zuckerman IH, Hsu VD, Boockvar KS, Zimmerman S, et al. Sensitivity of nursing home cost comparisons to method of dementia diagnosis ascertainment. Int J Alzheimers Dis. 2009;2009:1–10.

  11. Grabowski DC, Stewart KA, Broderick SM, Coots LA. Predictors of nursing home hospitalization: a review of the literature. Med Care Res Rev. 2008;65:3–39.

  12. Carron P-N, Mabire C, Yersin B, Bula C. Nursing home residents at the emergency department: a 6-year retrospective analysis in a Swiss academic hospital. Intern Emerg Med. 2017;12:229–37.

  13. Jansen I. Residential long-term care: public solutions to access and quality problems. Healthc Pap. 2011;10:8–22.

  14. Kessler C, Williams MC, Moustoukas JN, Pappas C. Transitions of care for the geriatric patient in the emergency department. Clin Geriatr Med. 2013;29:49–69.

  15. Meddings J. Using administrative discharge diagnoses to track hospital-acquired pressure ulcer incidence--limitations, links, and leaps. Jt Comm J Qual Patient Saf. 2015;41:243–5.

  16. Roos LL, Gupta S, Soodeen R-A, Jebamani L. Data quality in an information-rich environment: Canada as an example. Can J Aging. 2005;24(Suppl 1):153–70.

  17. Roos LL, Menec V, Currie RJ. Policy analysis in an information-rich environment. Soc Sci Med. 2004;58:2231–41.

  18. Jutte DP, Roos LL, Brownell MD. Administrative record linkage as a tool for public health research. Annu Rev Public Health. 2011;32:91–108.

  19. Brownell MD, Jutte DP. Administrative data linkage as a tool for child maltreatment research. Child Abuse Negl. 2013;37:120–4.

  20. Hirdes JP, Poss JW, Caldarelli H, Fries BE, Morris JN, Teare GF, et al. An evaluation of data quality in Canada’s continuing care reporting system (CCRS): secondary analyses of Ontario data submitted between 1996 and 2011. BMC Med Inform Decis Mak. 2013;13:27.

  21. Zinn J, Feng Z, Mor V, Intrator O, Grabowski D. Restructuring in response to case mix reimbursement in nursing homes: a contingency approach. Health Care Manag Rev. 2008;33:113–23.

  22. Hartmaier SL, Sloane PD, Guess HA, Koch GG, Mitchell CM, Phillips CD. Validation of the minimum data set cognitive performance scale: agreement with the mini-mental state examination. J Gerontol A Biol Sci Med Sci. 1995;50:M128–33.

  23. Fries BE, Simon SE, Morris JN, Flodstrom C, Bookstein FL. Pain in U.S. nursing homes: validating a pain scale for the minimum data set. Gerontologist. 2001;41:173–9.

  24. Burrows AB, Morris JN, Simon SE, Hirdes JP, Phillips C. Development of a minimum data set-based depression rating scale for use in nursing homes. Age Ageing. 2000;29:165–72.

  25. Hutchinson AM, Milke DL, Maisey S, Johnson C, Squires JE, Teare G, et al. The resident assessment instrument-minimum data set 2.0 quality indicators: a systematic review. BMC Health Serv Res. 2010;10:166.

  26. Gambassi G, Landi F, Peng L, Brostrup-Jensen C, Calore K, Hiris J, et al. Validity of diagnostic and drug data in standardized nursing home resident assessments: potential for geriatric pharmacoepidemiology. Med Care. 1998;36:167–79.

  27. Lix LM, Yan L, Blackburn D, Hu N, Schneider-Lindner V, Shevchuk Y, et al. Agreement between administrative data and the resident assessment instrument minimum dataset (RAI-MDS) for medication use in long-term care facilities: a population-based study. BMC Geriatr. 2015;15:24.

  28. Lix LM, Yan L, Blackburn D, Hu N, Schneider-Lindner V, Teare GF. Validity of the RAI-MDS for ascertaining diabetes and comorbid conditions in long-term care facility residents. BMC Health Serv Res. 2014;14:17.

  29. Wodchis WP, Naglie G, Teare GF. Validating diagnostic information on the minimum data set in Ontario hospital-based long-term care. Med Care. 2008;46:882–7.

  30. Mor V, Intrator O, Unruh MA, Cai S. Temporal and geographic variation in the validity and internal consistency of the nursing home resident assessment minimum data set 2.0. BMC Health Serv Res. 2011;11:78.

  31. Doupe M, Finlayson G, Khan S, Yogendran M, McDougall C, Kulbaba C. Supportive Housing For seniors: reform implications for Manitoba’s older adult continuum of care. Winnipeg: Manitoba Centre for Health Policy; 2016. http://mchp-appserv.cpe.umanitoba.ca/reference//Supportive%20Housing_report_web.pdf. Accessed 6 Sep 2016

  32. Canadian Institute for Health Information. CIHI Data Quality Study of the 2005–2006 Discharge Abstract Database. Ottawa; 2009. https://secure.cihi.ca/free_products/DAD_DQ_study_2005_2006_August_2009_e.pdf Accessed 11 Jul 2017.

  33. Doupe M, Chateau D, Derksen S, Sarkar J, Lobato de Faria R, S T, et al. Factors affecting emergency department waiting room times in Winnipeg. Winnipeg: Manitoba Centre for Health Policy; 2017. http://mchp-appserv.cpe.umanitoba.ca/reference//ER2011_report_Web.pdf Accessed 8 Apr 2017

  34. Cohen S, Syme L. Issues in the study and application of social support. In: Cohen S, Syme L, editors. Social support and health. New York: Academic Press; 1985. p. 3–22.

  35. Ene M, Leighton EA, Blue GL, Bell BA. Multilevel models for categorical data using SAS® PROC GLIMMIX: the basics. 2014. https://support.sas.com/resources/papers/proceedings15/3430-2015.pdf Accessed 17 Jul 2017.

  36. Doupe MB, Day S, McGregor MJ, John PS, Chateau D, Puchniak J, et al. Pressure ulcers among newly admitted nursing home residents: measuring the impact of transferring from hospital. Med Care. 2016;54:584–91.

  37. Baumgarten M, Margolis D, Gruber-Baldini AL, Zimmerman S, German P, Hebel JR, et al. Pressure ulcers and the transition to long-term care. Adv Skin Wound Care. 2003;16:299–304.

  38. Young Y. Factors associated with permanent transition from independent living to nursing home in a continuing care retirement community. J Am Med Dir Assoc. 2009;10(7):491.

  39. Grabowski DC, Stevenson DG, Cornell PY. Assisted living expansion and the market for nursing home care. Health Serv Res. 2012;47:2296–315.

  40. Sivananthan SN, Doupe M, McGregor MJ. Exploring the ecology of Canada’s publicly funded residential long-term care bed supply. Can J Aging. 2015;34:60–74.

  41. Gruneir A, Bell CM, Bronskill SE, Schull M, Anderson GM, Rochon PA. Frequency and pattern of emergency department visits by long-term care residents--a population-based study. J Am Geriatr Soc. 2010;58:510–7.

  42. Menec VH, Nowicki S, Blandford A, Veselyuk D. Hospitalizations at the end of life among long-term care residents. J Gerontol A Biol Sci Med Sci. 2009;64:395–402.

Acknowledgments

Data used in this study are from the Population Health Research Data Repository housed at the Manitoba Centre for Health Policy, University of Manitoba, and were provided by Manitoba Health and the Winnipeg Regional Health Authority. The results and conclusions are those of the authors. No official endorsement by the Manitoba Centre for Health Policy or other data providers is intended or should be inferred. We would like to thank Jessica Jarmasz for preparing this manuscript for submission.

Funding

This research was supported by a peer reviewed operating grant provided by Research Manitoba. This organization had no input into: i) the design of the study; ii) the collection, analysis, and interpretation of data; and iii) the writing of the manuscript.

Availability of data and materials

Data used in this article were derived from administrative health and social data as a secondary use. The data were provided under specific data sharing agreements only for approved use at the Manitoba Centre for Health Policy (MCHP). The original source data are not owned by the researchers or MCHP and as such cannot be provided to a public repository. The original data sources and approvals for use are noted in the ethics approval and acknowledgment sections of the article. Where necessary, source data specific to this article may be reviewed at MCHP with the consent of the original data providers, along with the required privacy and ethical review bodies.

Author information

Contributions

MBD conceived the project, developed and supervised the data analysis plan, and drafted the manuscript. LML assisted with drafting the manuscript. LML, JP, PGN, AG, and SZ critically reviewed the manuscript. ND conducted the data extraction and carried out the data analyses. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Malcolm B. Doupe.

Ethics declarations

Ethics approval and consent to participate

Approval to conduct the research was provided by the University of Manitoba Health Research Ethics Board (file #: HS14151) and by the Provincial Health Information Privacy Committee (file #: 2011/2012–26).

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional file

Additional file 1:

Study Outcome Contingency Tables. Data compare the counts of different nursing home users (e.g., newly admitted residents, those who transferred between nursing homes, people who died) and measures of healthcare use (e.g., hospital in-patient visits, emergency department visits, physician examinations) between the Minimum Data Set system and administrative files. (DOCX 23 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Doupe, M.B., Poss, J., Norton, P.G. et al. How well does the minimum data set measure healthcare use? a validation study. BMC Health Serv Res 18, 279 (2018). https://doi.org/10.1186/s12913-018-3089-7

Keywords