
From theory to practice: improving the impact of health services research



While significant strides have been made in health research, the incorporation of research evidence into healthcare decision-making has been marginal. The purpose of this paper is to provide an overview of how the utility of health services research can be improved through the use of theory. Integrating theory into health services research can improve research methodology and encourage stronger collaboration with decision-makers.


Recognizing the importance of theory calls for new expectations in the practice of health services research. These include: the formation of interdisciplinary research teams; broadening the training for those who will practice health services research; and supportive organizational conditions that promote collaboration between researchers and decision makers. Further, funding bodies can provide a significant role in guiding and supporting the use of theory in the practice of health services research.


Institutions and researchers should incorporate the use of theory if health services research is to fulfill its potential for improving the delivery of health care.



Background

While significant strides have been made in medical research over the past several decades, many research results considered important by researchers and expert committees are not being used by health care practitioners. While the value of health services research must be judged by its validity, its utility cannot be taken for granted. There has been an assumption that when research information is available it will be accessed, appraised and then applied [1]. However, knowledge of a research-based recommendation is by itself insufficient to ensure its adoption. While the value of research evidence as a basis for decision making in health care is well established, the incorporation of such evidence into decision-making remains inconsistent [2].

The gap between research evidence and its incorporation into practice has led to increased research on how to bring new knowledge to bear on everyday health care. Factors influencing the adoption of research evidence have been studied extensively [3–5]. Personal attributes, time, organizational boundaries, geography and educational background all contribute to decision-makers' responses to research evidence [6, 7]. An area that has received less attention is the incorporation of theory in health services research. The authors of this paper propose that a stronger theoretical base would make health services research more informative and influential, facilitating the adoption of research results into practice. Integrating theory into health services research is an important first step. In this paper we first describe the importance of theory, then discuss how theory-driven research changes the manner in which researchers interact with decision-makers. We conclude by considering how theory-driven research may influence the training, practice and funding of health services research.


The importance of theory

In recent years a number of researchers have advocated a greater role for theory in strengthening the practice of research [8–12]. However, health services research has continued to focus primarily on evaluating outcomes, with less attention to the mechanisms by which these outcomes are produced [10, 13, 14]. The emphasis on method at the expense of theory has led to several criticisms. Chen and Rossi [15] argue that an atheoretical approach to research is characterized by adherence to a step-by-step, cookbook method for doing outcome studies. In this situation, they contend, research is reduced to a set of predetermined steps that are mechanically applied to various interventions without concern for the theoretical implications of intervention content, setting, participants or implementing organizations. The atheoretical approach tends to result in a simple input/output, or "black box", type of study [13]. Such simple evaluations may provide a gross assessment of whether or not an intervention works under one set of conditions but fail to identify the reasons why. As such, the conclusions are often less than satisfying to consumers of research results and not easily transferable to different settings.

Theory provides a systematic view of a phenomenon by specifying the relations among variables and propositions, with the purpose of explaining or predicting phenomena that occur in the world [16, 17]. In health services research, theory can provide a framework for understanding the relationship between program inputs (resources), program activities (how the program is implemented) and their outputs or outcomes [11, 13]. In addition to identifying the mechanisms by which programs are effective, theory may consider program implementation and contextual factors. While it is important to know the extent to which an intervention attains intended outcomes, it is also essential to know what occurred in the implementation of the intervention. Variation in implementation may be due to differences among program providers, target population characteristics, and differences among sites in how the intervention is delivered. Theory also offers the opportunity to specify the contextual conditions that will influence the effectiveness of an intervention. Attitudinal factors at the provider level, as well as structural and cultural factors at the organizational level, have been underappreciated in exploring variations in health care outcomes [9, 18, 19]. Understanding the influence that contextual factors have on program implementation and outcomes facilitates successful application of the intervention in alternate settings, thereby addressing the generalizability of an intervention.

Theory offers many advantages to the health services researcher. It helps to identify the appropriate study question and target group; clarify methods and measurement issues; provide more detailed and informative descriptions of the characteristics of the intervention and supportive implementation conditions; uncover unintended effects; assist in the analysis and interpretation of results; and support the successful application of an intervention in different settings [11, 12].

Theory-driven studies address the challenge, posed by both decision-makers and funding agencies, to move beyond simplistic explanations of significance in health services research. Decision-makers are seeking explanations of how an intervention works and whether, when applied to a different environment, it will work in a fashion similar to the intervention that was evaluated [10, 12, 20].

Despite these potential benefits, a number of reasons have been offered as to why theory has not been integrated into research. Ironically, clinical randomized controlled trials have discouraged the use of theory in health services research. Given the genesis of clinical trial methodology, this may derive, in part, from the very origins of epidemiology: John Snow allegedly ended an epidemic of cholera by removing the handle from the Broad Street water pump, even though he had no concept of what actually caused cholera. By setting aside the need for theory, Snow was able to act effectively despite the fact that the theories he would have needed had not yet been elucidated. Similarly, we know that lung cancer incidence can be reduced by the elimination of cigarette smoking, even though we do not know exactly how cigarette smoke causes lung cancer. Experimental trials often determine intervention effects without considering how the component features of an intervention work together to bring about study outcomes [13, 15, 21]. The more complex the intervention, the more difficult it is to know what the treatment entailed. There is growing recognition of the need to establish the theoretical bases of interventions. The United Kingdom Medical Research Council recently proposed a framework for the development and evaluation of randomized controlled trials for complex interventions, in which theory is viewed as valuable in assisting hypothesis development and steering decisions on strategic design issues [22].

Adopting a theory-driven approach in health services research is not without its challenges. Given the typical training of researchers and the uni-disciplinary nature of the practice, the first challenge is the capacity of researchers to engage in theory-driven research. Second, a theory-driven approach requires organizational conditions that support researchers and decision-makers collaborating in the development and testing of theory. Finally, theory development and testing are cumulative in nature, encouraging researchers to pursue a programmatic approach to research. This approach has implications for how funding agencies support health services research. Despite these challenges, a theory-based approach offers promise for a greater understanding of what happens when interventions work to address social and health problems.

The importance of collaboration with decision-makers

Collaborative research partnerships between academic researchers and decision-makers describe a relationship and process between individuals from different backgrounds who, together, develop an integrative, cooperative approach to resolving a research problem [23]. Such partnership has been identified as a significant strategy that holds multiple benefits [23–27].

Collaborative practice has also been identified as a key strategy in facilitating a theory-driven approach. Weiss [28] recommends that the first criterion in selecting a theory to guide the evaluation of a program is to draw the theory out from those associated with the program, including its designers, program personnel and relevant clinical staff. The argument is that few programs are theory driven; rather, they are typically the product of the experience and values of those who are associated with the program. In recent years a number of techniques have been developed for this purpose. Strategies range from unstructured interviews to highly structured, iterative interactions between program personnel and researchers [29–31]. Perspectives of service providers can be rounded out by a review of the research literature. In fact, a number of researchers suggest a combination of these two approaches [10, 32].

Viewing program stakeholders as a key source in developing theory in health services research demands stronger collaboration between researchers and program decision-makers [9]. In this fashion, collaborative practice becomes a methodological strategy in health services research. Lomas [7] has stressed that a first step in encouraging meaningful partnerships between researchers and decision-makers is to view linkage and exchange between the two as a process, not as a discrete event. Establishing and maintaining ongoing links offers a more comprehensive understanding between the two groups. Researchers uncover the desired program outcomes and the causal chain of the program intervention, and develop a better understanding of the contextual factors that influence variation in intervention implementation and outcomes. Similarly, decision-makers develop a deeper understanding of the research process and thus can influence the development of feasible and sustainable interventions for practice settings.

The role and impact of the researcher and the research process in practice settings have received greater attention in other fields such as program evaluation, nursing, anthropology and community psychology. For example, core principles of community psychology practice include: a) consistency of goals and values between the researcher and the setting; and b) the notion that interventions should have the potential to be "institutionalized", or systematically established within the setting, in a way that strengthens the setting's natural resources [25–27]. Rather than reinventing the wheel, health services research could benefit from theoretical frameworks developed within these disciplines.

Implications for the practice of health services research

Recognizing the importance of theory calls for new expectations in the practice of health services research. There are a number of challenges that must be met in order for these perspectives to gain acceptance in the health services research community.

Evolving perspectives on the practice of health services research require recognition that few disciplines are able to span the breadth of responsibilities associated with the research process. To date there has been a tendency for health services research to be practiced as a uni-discipline, with clinical disciplines tending to practice separately from the social science disciplines. A priority is to encourage the formation of interdisciplinary research teams. Pursuing this agenda will promote the formation of research teams that may include business, anthropology, sociology, psychology, education, engineering, nursing and medicine. Combined disciplinary skills would, in a complementary fashion, address the breadth of skills required in a more complex research environment that includes the development and testing of theory.

A second point concerns broadening the training of those who will practice health services research. By and large, academic training has focused on methodological issues. While a focus on research methods has made an important contribution to the practice of health services research, relying on research methods as a core curriculum has led to limitations in the training of health services researchers, such as inadequate attention to the value of theory-driven research. As health services research expands its methodological repertoire beyond the classical randomized controlled trial, researchers face increased ambiguity in attributing the source of intervention impact. It is in this circumstance that theory can guide health services researchers in understanding the causal linkages within an intervention. Further, students are educated in separate departments with little planned, formal activity across disciplines, which discourages cooperative approaches to research and service [33–35]. Education programs are generally not structured to convey the importance of interdisciplinary strategies. Identifying the processes associated with creating effective linkages between researchers and decision-makers is also not typically part of training. Rethinking current assumptions and practices regarding the training of health services researchers will enable trainees to be better prepared for their evolving responsibilities.

Collaboration between researchers and decision-makers is contingent upon supportive organizational conditions for both partners. Researchers have operated, and most likely will continue to operate, from university-based settings, where incentives for promotion and tenure can act as barriers to changes in the practice of health services research [7, 36]. Most academic institutions award tenure and promote faculty based upon the frequency and quality of publications and on obtaining peer-reviewed funding [36]. The time involved in collaborating with decision-makers and in jointly planning and implementing research often represents activity that is not recognized by tenure and promotion committees. As well, these activities may slow the production of research results and the generation of publications. Recognizing these factors requires academic centres to generate new criteria for evaluating contributions to knowledge and practice. Decision-making organizations also play a significant role in ensuring the success of collaborative relationships. The clearest indication of institutional support for research is to provide the time and resources for decision-makers to participate in collaborative activities with researchers.

Funding bodies have the potential to play a significant role in guiding and integrating these considerations into health services research. Research sponsors can develop evaluation criteria that encourage the application of theory. As an example, the Agency for Healthcare Research and Quality (AHRQ) in the USA funded an initiative (Translating Research into Practice) to identify sustainable and reproducible strategies that will: 1) accelerate the impact of health services research on direct patient care; and 2) improve the outcomes, quality, effectiveness, efficiency, and/or cost-effectiveness of care through partnerships between health care organizations and researchers [37]. Further, research sponsors are beginning to move away from supporting single-shot studies conducted in relative independence from one another. This focus on supporting programmatic research should be encouraged. Programmatic research offers a cumulative environment that allows researchers the opportunity to develop and test the application of theory. In a similar fashion, collaborative practice is also best conducted in a programmatic environment. Developing and maintaining linkages with decision-makers is predicated on long-term relations. Embedded within these linkages are fundamental professional and personal attributes that include credibility, familiarity, mutual understanding and trust [38, 39].


Conclusion

This paper has examined the importance of theory in health services research. We have argued that strengthening the role of theory encourages collaborative practice between researchers and decision-makers. A theory-driven approach in health services research is not without its challenges; however, given the modest advances towards incorporating research evidence into healthcare decisions, it is well worth the effort. One implication of this approach is its impact on the training and practice of health services research. Institutions and researchers should consider this emerging model of practice if health services research is to fulfill its potential for improving the delivery of care.


References

1. Lenfant C: Clinical research to clinical practice – lost in translation. N Engl J Med. 2003, 349: 868-874. 10.1056/NEJMsa035507.
2. Oxman A, Thompson M, Davis D, Haynes B: No magic bullets: A systematic review of 102 trials of interventions to improve professional practice. CMAJ. 1995, 153: 1423-1431.
3. Cabana M, Rand C, Powe N, Wu M, Abbound P, Rubin H: Why don't physicians follow clinical practice guidelines? A framework for improvement. JAMA. 1999, 282: 1458-1465. 10.1001/jama.282.15.1458.
4. Mittman B, Tonesk X, Jacobson P: Implementing clinical practice guidelines: Social influence strategies and practitioner behavior change. QRB Qual Rev Bull. 1992, 18: 413-422.
5. Lomas J: Retailing research: Increasing the role of evidence in clinical services for childbirth. Milbank Q. 1993, 71: 439-1373.
6. Funk S, Champagne M, Tornquist E, Wiese R: Administrators' views on barriers to research utilization. ANR. 1995, 8: 44-49. 10.1016/S0897-1897(95)80331-9.
7. Lomas J: Improving research dissemination and uptake in the health sector: Beyond the sound of one hand clapping. Health Policy Commentary Series No. C9701. 1997, Hamilton: Centre for Health Economics and Policy Analysis, McMaster University.
8. Grimshaw J, Eccles M, Walker A: Changing physicians' behavior: What works and thoughts on getting more things to work. J Contin Educ Health Prof. 2002, 22: 237-243.
9. Foy R, Eccles M, Grimshaw J: Why does primary care need more implementation research? Fam Pract. 2001, 18: 353-355. 10.1093/fampra/18.4.353.
10. Chen H: Theory-driven Evaluations. 1990, Newbury Park: Sage Publications.
11. Weiss C: Evaluation Methods for Studying Programs and Policies. 1998, New Jersey: Prentice-Hall.
12. Sidani S, Braden C: Evaluating Nursing Interventions: A Theory-driven Approach. 1998, Newbury Park: Sage Publications.
13. Lipsey M: Theory as method: Small theories of treatments. In: Research Methodology: Strengthening Causal Interpretations of Nonexperimental Data: Conference Proceedings. Edited by: Sechrest L, Perrin E, Bunker J. 1990, Washington: Agency for Health Care Policy and Research, 33-51.
14. DeFriese G: Theory as method. In: Research Methodology: Strengthening Causal Interpretations of Nonexperimental Data: Conference Proceedings. Edited by: Sechrest L, Perrin E, Bunker J. 1990, Washington: Agency for Health Care Policy and Research, 53-55.
15. Chen H, Rossi P: Evaluating with sense: The theory-driven approach. Eval Rev. 1983, 7: 283-302.
16. Kerlinger F: Foundations of Behavioral Research. 1964, New York: Holt, Rinehart and Winston.
17. Creswell J: Research Design: Qualitative, Quantitative, and Mixed Methods Approaches. 2003, Thousand Oaks: Sage Publications.
18. Sheldon T: It ain't what you do but the way that you do it. J Health Serv Res Policy. 2001, 6: 3-5. 10.1258/1355819011927125.
19. Ferlie E, Shortell S: Improving the quality of health care in the United Kingdom and the United States: A framework for change. Milbank Q. 2001, 79: 281-315. 10.1111/1468-0009.00206.
20. Wholey J: Assessing the feasibility and likely usefulness of evaluation. In: Handbook of Practical Program Evaluation. Edited by: Wholey J, Hatry H, Newcomer K. 1996, San Francisco: Jossey-Bass, 15-39.
21. Cole G: Advancing the development and application of theory-based evaluation in the practice of public health. Am J Eval. 1999, 20: 453-470. 10.1016/S1098-2140(99)00033-8.
22. Medical Research Council: A Framework for Development and Evaluation of RCTs for Complex Interventions to Improve Health. MRC Health Services and Public Health Board discussion document. 2000.
23. Gitlin L, Lyons K, Kolodner E: A model to build collaborative research or educational teams of health professionals in gerontology. Educ Gerontol. 1994, 20: 15-34.
24. Kitson A, Ahmed L, Harvey G, Seers K, Thompson D: From research to practice: One organizational model for promoting research-based practice. J Adv Nurs. 1996, 23: 430-440.
25. Israel B, Schulz A, Parker E, Becker A: Review of community-based research: Assessing partnership to improve public health. Annu Rev Public Health. 1998, 19: 173-202. 10.1146/annurev.publhealth.19.1.173.
26. Roussos S, Fawcett S: A review of collaborative partnerships as a strategy for improving community health. Annu Rev Public Health. 2000, 21: 369-402. 10.1146/annurev.publhealth.21.1.369.
27. Levine M, Perkins D: Principles of Community Psychology: Perspectives and Applications. 1987, New York: Oxford University Press.
28. Weiss C: Which links in which theories shall we evaluate? In: Program Theory in Evaluation: Challenges and Opportunities. New Directions for Program Evaluation, no 87. Edited by: Rogers P, Hacsi T, Petrosino A, Huebner T. 2000, San Francisco: Jossey-Bass, 35-45.
29. Huebner T: Theory-based evaluation: Gaining a shared understanding between school staff and evaluators. In: Program Theory in Evaluation: Challenges and Opportunities. New Directions for Program Evaluation, no 87. Edited by: Rogers P, Hacsi T, Petrosino A, Huebner T. 2000, San Francisco: Jossey-Bass, 79-89.
30. Lipsey M, Pollard J: Driving toward theory in program evaluation: More models to choose from. Eval Program Plann. 1989, 12: 317-328. 10.1016/0149-7189(89)90048-7.
31. Patton M: Utilization-Focused Evaluation. 3rd edition. 1996, Thousand Oaks: Sage.
32. Pawson R, Tilley N: Realistic Evaluation. 1995, London: Sage.
33. Colloton J: Academic medicine's changing covenant with society. Acad Med. 1989, 64: 55-60.
34. Pew Health Professions Commission: Critical Challenges: Revitalizing the Health Professions for the Twenty-first Century. 1995, San Francisco: UCSF Center for the Health Professions.
35. MacLeod S, McCullough H: Social science education as a component of medical training. Soc Sci Med. 1994, 39: 1367-1373. 10.1016/0277-9536(94)90367-0.
36. Pranulis M: Research programs in a clinical setting. West J Nurs Res. 1991, 13: 274-277.
37. Farquhar C, Stryer D, Slutsky J: Translating research into practice: The future ahead. Int J Qual Health Care. 2002, 14: 233-249.
38. Baker E, Homan S, Schonoff R, Kreuter M: Principles of practice for academic/practice/community research partnerships. Am J Prev Med. 1999, 703: 74-85.
39. LeGris J, Weir R, Browne G, Gafni A, Stewart L, Easton SL: Developing a model of collaborative research: The complexities and challenges of implementation. Int J Nurs Stud. 2000, 37: 65-79. 10.1016/S0020-7489(99)00036-X.


Author information



Corresponding author

Correspondence to Kevin Brazil.

Additional information

Competing interests

The authors declare that there are no competing interests. Disclaimer: The opinions expressed are the authors' own and do not necessarily represent official policy of AHRQ or the Department of Health and Human Services.

Authors' contributions

KB drafted the manuscript and edited and revised its contents. EO edited and revised the manuscript. KB, EO, MC, RS and DS all contributed to the conceptual development, editing and review of the manuscript.



Cite this article

Brazil, K., Ozer, E., Cloutier, M.M. et al. From theory to practice: improving the impact of health services research. BMC Health Serv Res 5, 1 (2005).


