Individual, institutional, and scientific environment factors associated with questionable research practices in the reporting of messages and conclusions in scientific health services research publications

Abstract

Background

Health Services Research (HSR) findings reported in scientific publications may become part of the decision-making process on healthcare. This study aimed to explore associations between researchers’ individual, institutional, and scientific environment factors and the occurrence of questionable research practices (QRPs) in the reporting of messages and conclusions in scientific HSR publications.

Methods

We employed a mixed-methods study design. We identified factors possibly contributing to QRPs in the reporting of messages and conclusions through a literature review, 14 semi-structured interviews with HSR institutional leaders, and 13 focus-groups amongst researchers. A survey corresponding to these factors was developed and shared with 172 authors of 116 scientific HSR publications produced by Dutch research institutes in 2016. We assessed the included publications for the occurrence of QRPs. An exploratory factor analysis was conducted to identify factors within the individual, institutional, and scientific environment domains. Next, we conducted bivariate analyses using simple Poisson regression to explore each factor’s association with the number of QRPs in the assessed HSR publications. Factors related to QRPs with a p-value < .30 were included in four multivariate models tested through multiple Poisson regression.

Results

In total, 78 (45%) participants completed the survey (51.3% first authors and 48.7% last authors). Twelve factors were included in the multivariate analyses. In all four multivariate models, a higher score on “pressure to create societal impact” (Exp B = 1.28, 95% CI [1.11, 1.47]) was associated with a higher number of QRPs. Higher scores on the “specific training” (Exp B = 0.85, 95% CI [0.77, 0.94]) and “co-author conflict of interest” (Exp B = 0.85, 95% CI [0.75, 0.97]) factors were associated with a lower number of QRPs. Stratification between first and last authors indicated that different factors were related to the occurrence of QRPs for these groups.

Conclusion

Experienced pressure to create societal impact is associated with more QRPs in the reporting of messages and conclusions in HSR publications. Specific training in reporting messages and conclusions and awareness of co-author conflict of interests are related to fewer QRPs. Our results should stimulate awareness within the field of HSR internationally on opportunities to better support reporting in scientific HSR publications.


Background

In 2009, it was estimated that 85% of research funding in the biomedical sciences was avoidably wasted [1]. In the biomedical sciences, evidence has accumulated on questionable research practices (QRPs) such as imbalanced research question selection, poor study design and execution, non-publication, and poor reporting [1]. Over time, advances have been made to address these QRPs, including in scientific reporting [2]. However, proper interpretation and reporting of messages and conclusions across different research methodologies in scientific publications requires more attention [3]. Researchers can introduce various QRPs in the reporting of messages and conclusions in their scientific publications (e.g., generalizing findings to populations not included in the study, not reporting contradictory evidence, claiming an unjustified causal relationship, and inadequately justifying conclusions) [3,4,5]. Moreover, although scientific reporting of biomedical studies is improving [2], responsible scientific reporting requires greater awareness.

In this study we focus on the field of Health Services Research (HSR). HSR has a direct link to policy and practice, where stakeholders and funders may contribute considerably to the interpretation of results [6, 7]. Additionally, HSR relies on mixed methodologies such as qualitative and mixed methods designs that may have less strict reporting requirements compared to quantitative designs such as randomized controlled trials [3].

Recent work has suggested that scientific HSR publications may include a median of six QRPs in the reporting of messages and conclusions [8]. QRPs were primarily found in reported implications and recommendations for policy and practice, in a failure to mention contradictory evidence, and in the conclusions of the scientific publication [8]. The occurrence of these QRPs is concerning, as messages and conclusions reported in the scientific HSR literature are often transferred to policy makers, managers, and the general public. These groups may learn about messages and conclusions directly from the scientific publication or through societal publications such as professional journals, factsheets, press releases, and reports [7, 9,10,11,12]. Whether messages are disseminated by researchers, science communicators, or journalists, they may be accepted as established evidence and become part of the decision-making process on health and healthcare. Decisions on topics such as co-payments, adaptation of protocols in hospitals, admitting medications to insurance packages, and tobacco regulation may thus be affected by inadequately reported messages and conclusions [9, 13, 14].

Scientific journals have taken the lead in implementing control measures to provide structure to the review process and improve responsible reporting [9]. These efforts have resulted in practices such as publication checklists [15], data sharing, open access [16], and public peer review [17] becoming increasingly common. Yet, these measures are primarily aimed at increasing transparency in reporting and may thus be insufficient to prevent QRPs in the reporting of messages and conclusions specifically. To strengthen the reporting of messages and conclusions, measures may need to be taken at multiple levels, including academic journals and research institutions themselves [18]. QRPs may be caused by adverse incentives at the institutional level, such as an inadequate reward system, lacking reporting infrastructures, and insufficient prepublication review [1]. Recently, Dutch academic and non-academic HSR institutions have begun to collaborate with the goal of increasing responsible reporting of HSR findings. Non-academic institutions are research institutions independent of universities. These efforts have been supported by the Netherlands Organization for Health Research and Development (ZonMw).

HSR institutions in the Netherlands have varying organizational policies for fostering responsible conduct of research, including responsible reporting. This variety in institutional culture and organization offers the opportunity to learn from each other’s reporting practices. Improving the scientific publication of HSR requires an understanding of the factors that influence authors in their writing, as well as those that affect the publication process itself (e.g., pressure and relationships with funders) [19,20,21]. Research institutions may prevent the occurrence of QRPs by improving internal integrity and by training researchers in scientific writing and communication [19,20,21]. However, considering the specific characteristics of HSR, additional evidence is needed on how possible factors may relate to QRPs in messages and conclusions specifically [22].

Consequently, the aim of this study was to explore associations between individual, institutional, and scientific environment factors and the frequency of inadequacies in the reporting of messages and conclusions in scientific HSR publications.

Methods

Design

We employed a mixed-methods study design. First, we identified factors possibly contributing to the occurrence of QRPs in the reporting of messages and conclusions in scientific HSR publications through a literature review, 14 semi-structured interviews with leaders of HSR groups or institutions in the Netherlands, and 13 focus-groups amongst junior health services researchers. Factors were clustered into three domains: individual, institutional and scientific environmental domains [9]. Second, a survey corresponding to the identified factors was developed and shared with 172 first and last authors of a sample of 116 scientific HSR publications published in 2016 with an affiliation to Dutch HSR groups or institutions.

Setting

The study involved publications and participants from 13 HSR groups, departments, or institutions including both academic and non-academic institutions (hereafter referred to as “HSR institutions”) in the Netherlands. These institutions agreed to participate in an effort to assure the overall quality of HSR publications in the Netherlands.

Conceptual framework on factors potentially associated with QRPs in the reporting of messages and conclusions in HSR

Factors potentially associated with QRPs in HSR were identified through an exploratory literature review, 14 semi-structured interviews with 19 leaders/representatives of the 13 participating institutions, and 13 focus-groups comprising 57 junior/PhD researchers at participating HSR institutions. An initial overview of factors was created through the literature review. This overview was then discussed in the semi-structured interviews with the leaders/representatives of participating institutions. Within the focus-groups, an open conversation was held with participants to identify additional factors overlooked in the literature and interviews (a focus group guide is provided as supplementary material 1). Documented interview reports and transcripts were qualitatively analysed in MaxQDA, resulting in the specification of factors potentially associated with QRPs in the reporting of messages and conclusions in scientific HSR publications. The methods applied for the development of these factors are described in more detail in supplementary material 2.

Identified factors were included in a theoretical framework consisting of three domains: individual, institutional, and scientific environments. Of note, factors within each domain could be influenced by those in other domains. The individual domain was comprised of factors bound to the individual researcher, including those associated with research experience and self-efficacy. The institutional domain included factors controlled by the institution that houses the researcher. These included institutional culture, facilities, interactions, and policies that may affect the writing and publication experience of the researcher. More concretely, an institution may have an (unofficial) policy to produce a certain number of publications per year. The scientific environmental domain included factors that manifest outside of the direct control of the institution, including those characterizing scientific culture and systems in general.

The framework of included factors is provided in Table 1.

Table 1 Framework and included factors

Survey development

The survey was designed based on the framework described above. For each identified factor in the preliminary framework, one or more survey questions were developed. Questions were evaluated on their face validity by the co-authors and two project advisors, both senior health services researchers. For this study, we developed a new questionnaire, as no existing validated questionnaires were tailored to the field of HSR or to scientific reporting. One question from the publication pressure questionnaire was included in our newly developed questionnaire [21]. The questionnaire was developed in English (i.e., the primary working language of the study population).

A “think out loud” test was performed with two people from the target population. RG sat down with two researchers individually as they answered the survey questions and commented on their interpretation. The survey was checked by a native English speaker. After the final revision, the survey included 97 questions related to the factors within the individual, institutional, and scientific environment domains. Seven additional questions were included to assess personal and background characteristics. Answers to survey questions were provided on a Likert scale (strongly disagree, disagree, neither agree nor disagree, agree, and strongly agree).

The survey is provided in supplementary material 3.

Survey study population

In a previous study we assessed QRPs in the reporting of messages and conclusions in 116 international peer-reviewed publications authored by researchers from the 13 participating institutions. QRPs were defined as “to report, either intentionally or unintentionally, conclusions or messages that may lead to incorrect inferences and do not accurately reflect the objectives, the methodology, or the results of the study.” [8] For the assessment, we used a detailed assessment form including 35 possible QRPs in reporting messages and conclusions (e.g., “conclusions that do not adequately reflect the findings of the study”, “limitations are not adequately justified”). This assessment form, along with the corresponding methods and results, has been published elsewhere [6]. For the current study, we conducted a survey amongst the 172 first and last authors of these publications.

From these 116 scientific publications, we identified a total of 202 first and last authors (116 first authors and 86 unique last authors) as the sample for our study. Contact information (i.e., e-mail addresses) was obtained through the participating institutions. These institutions were asked to encourage their researchers to participate in the survey; however, participation was voluntary and participants could stop at any time. Participants were informed of the goal of the study and the data handling procedures in the invitation e-mail and at the start of the survey. We excluded 30 authors whose contact information was unknown, resulting in a final sample of 172 authors. The response rate was 45% (78 respondents).

Quantitative analysis

Dependent variable

The main dependent variable of this study was the number of QRPs in the reporting of messages and conclusions in HSR publications. QRPs were identified by RG, JM, and a third assessor. Two reviewers independently assessed each HSR publication. The absence or presence of QRPs was discussed for all publications and determined through mutual consensus. The data were obtained from one of our previous studies, which provides more details on the methodology applied for identifying QRPs [8].

Independent variables

The items (i.e., questions) included in the survey questionnaire were derived from the three domains described above (i.e., the individual, institutional, and scientific environment domains). An exploratory factor analysis was conducted to identify factors underlying the items within each domain. The factors identified in the exploratory analyses were named as much as possible in alignment with the factors in our theoretical framework.

We used the factors identified from the factor analysis as independent variables in our analyses. The methods and results of the factor analysis are further described in supplementary material 4.
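
The factor analysis itself was conducted in IBM SPSS (see supplementary material 4). As a purely illustrative sketch of how an exploratory factor analysis over Likert-scale items of one domain could be run, the snippet below uses Python and scikit-learn; the file name, item columns, and number of factors are hypothetical placeholders, not values taken from the study.

```python
# Illustrative sketch only: the study's factor analysis was run in IBM SPSS.
# The file name, item columns, and number of factors are hypothetical.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Hypothetical survey data: one row per respondent, Likert items (1-5)
# belonging to a single domain (e.g., the individual domain).
items = pd.read_csv("survey_items_individual_domain.csv")

# Exploratory factor analysis with varimax rotation (assumed settings).
efa = FactorAnalysis(n_components=5, rotation="varimax", random_state=0)
factor_scores = efa.fit_transform(items)  # per-respondent factor scores

# Loadings (items x factors), used to name factors in line with the framework.
loadings = pd.DataFrame(
    efa.components_.T,
    index=items.columns,
    columns=[f"factor_{i + 1}" for i in range(efa.n_components)],
)
print(loadings.round(2))
```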

Considering the explorative nature of our study, no assumptions were made regarding the relative importance of factors within and between domains, and we therefore included all factors resulting from the factor analyses.

Other characteristics

We collected several personal characteristics in the survey (i.e., age, working experience as scientific researcher, academic background, academic position, number of publications co-authored, and journal’s impact factor). We described sample characteristics based on these variables.

Statistical analysis

The basic characteristics of study samples were described based on the measurement scale of the variables. Categorical variables (nominal / ordinal) were presented as frequency and percentage, whereas numerical variables (interval/ratio) were presented using mean and standard deviation.

We conducted bivariate analysis using simple Poisson regression. Poisson regression was chosen considering the nature of the outcome (number of QRPs) as count data with a relatively small mean value. This analysis specifically assessed the association between each factor score and the number of QRPs in HSR publications. The analysis was also intended to reduce the number of factors which were included in the multivariate model.

Following the bivariate analysis, we applied multiple Poisson regression to further assess the association between the factor domains and the number of QRPs in HSR publications. To ensure the stability of our results, we developed four models in our multivariate analysis. The first two models were unadjusted and included the 12 factors from the bivariate analyses that were associated with QRPs at p < .30. The last two models additionally included the number of years of work experience as a scientific researcher and the journal’s impact factor, to examine these variables’ influence on the quality of reporting. For easier interpretation, we provided the coefficient of each explanatory variable (B), the exponential form of the coefficient (Exp B), and the 95% confidence interval (95% CI). The goodness of fit of all models was checked using the chi-square goodness-of-fit test as part of the Poisson regression procedure, with results suggesting that all models demonstrated good fit.
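
To make the modelling step concrete: the reported analyses were run in IBM SPSS, but a minimal sketch of an equivalent simple and multiple Poisson regression in Python (statsmodels) is shown below. The data file and variable names (e.g., n_qrps, pressure_societal_impact) are hypothetical placeholders for the QRP count, factor scores, and covariates.

```python
# Illustrative sketch only: the published analyses were run in IBM SPSS.
# The file and variable names below are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("qrp_survey_data.csv")  # one row per responding author

# Bivariate (simple) Poisson regression: one factor score at a time against
# the QRP count; factors with p < .30 are carried into the multivariate models.
simple = smf.poisson("n_qrps ~ pressure_societal_impact", data=df).fit()
print(simple.summary())

# Multiple Poisson regression (an adjusted model): selected factor scores plus
# work experience and journal impact factor as covariates.
adjusted = smf.poisson(
    "n_qrps ~ pressure_societal_impact + specific_training"
    " + coauthor_conflict_of_interest + years_experience + impact_factor",
    data=df,
).fit()

# Report Exp(B) (rate ratios) with 95% confidence intervals.
ci = adjusted.conf_int()
ci.columns = ["ci_low", "ci_high"]
ci.insert(0, "B", adjusted.params)
rate_ratios = np.exp(ci)
rate_ratios.columns = ["Exp(B)", "2.5%", "97.5%"]
print(rate_ratios.round(2))
```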

Considering that the factors we used as independent variables may be interrelated, we checked for collinearity in our regression model. Results from the correlation matrix in the exploratory factor analysis procedure showed that 35 of the 171 pairs of scales (20%) were significantly but not strongly correlated (rs < 0.3). These findings therefore suggested no multicollinearity issues in our analysis.
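
As an illustrative sketch (not the SPSS procedure used in the study), the pairwise screen over factor-scale scores described above could look as follows; the file and column names are hypothetical placeholders.

```python
# Illustrative sketch only: a pairwise-correlation screen across factor scales,
# mirroring the multicollinearity check described above. The file name and
# columns are hypothetical placeholders for the 19 factor-scale scores.
import numpy as np
import pandas as pd

scales = pd.read_csv("factor_scale_scores.csv")
corr = scales.corr()

# Keep only the upper triangle so each pair of scales is counted once
# (19 scales give 171 unique pairs).
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
pairs = upper.stack()  # one correlation per unique pair of scales

flagged = pairs[pairs.abs() >= 0.3]
print(f"{flagged.size} of {pairs.size} pairs correlate at |r| >= 0.3")
print(flagged.sort_values(ascending=False))
```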

Because first and last authors have different roles in the writing of scientific publications, we performed an additional analysis stratified by first and last authors to further explore the nature of the association between these factors and the number of QRPs. All analyses were conducted using IBM SPSS version 25.

Ethics approval

A waiver for ethical approval was obtained for this study from the medical ethics review committee at Amsterdam UMC. To avoid negative consequences for participants, each participant and publication was assigned a unique identification number. Extracted data were entered in SPSS using this number to separate author information from the study data.

Results

Of the survey participants, 51.3% were first authors and 48.7% were last authors. PhD students (25.6%) and professors (29.5%) were the most frequent academic positions in our sample. First authors were predominantly PhD students (50.0%), whereas last authors were predominantly professors (57.9%). Both first (40.0%) and last (28.9%) authors most often had an academic background in the social sciences. Last authors were older, had longer working experience as scientific researchers, and reported a larger number of co-authored publications than first authors. The journal impact factor of the publications was similar between first and last authors. The number of QRPs per publication was slightly higher for last authors than for first authors. The basic characteristics of the study sample are provided in Table 2. Twelve publications corresponded to both a first and a last author respondent, 28 publications to only a first author, and 26 publications to only a last author.

Table 2 Basic characteristics of survey respondents

Bivariate analyses

Table 3 depicts findings from bivariate analyses examining the relationship between each factor from the individual, institutional, and scientific environment domain and the number of QRPs. Of the five factors in the individual domain, “pressure to create societal impact” (Exp B = 1.34, 95% CI [1.18, 1.51]) and “self-efficacy” (Exp B = 0.84, 95% CI [0.72, 0.98]) exhibited significant associations with the number of QRPs. For institutional factors, only “specific training in reporting messages and conclusions” (Exp B = 0.85, 95% CI [0.77, 0.93]) exhibited a significant association with the number of QRPs. Stakeholder influence (Exp B = 1.16, 95% CI [1.06, 1.27]) was the only factor from the scientific environmental domain that exhibited a significant association with the number of QRPs.

Table 3 Results of bivariate Poisson regression analysis examining individual, institutional, and scientific environment domain factors’ associations with number of QRPs

Multivariate analyses

Results of the multivariate analyses are presented in Table 4. Across the four models in our analysis, three factors, i.e., “pressure to create societal impact”, “specific training”, and “co-author conflict of interest”, consistently exhibited a significant association with the number of QRPs.

Table 4 Multivariate analysis between factors from individual, institutional, and scientific environment domains with number of QRPs using Poisson regression

In the fully adjusted model (i.e., model 3), a one-point increase on the “pressure to create societal impact” factor was associated with a 28% increase in the number of QRPs in an HSR publication (Exp B = 1.28, 95% CI [1.11, 1.47]). Conversely, a one-point increase on the “specific training in reporting messages and conclusions” factor was associated with a 15% decrease in the number of QRPs in an HSR publication (Exp B = 0.85, 95% CI [0.77, 0.94]). A one-point increase on the “co-author conflict of interest” factor was associated with a 15% decrease in the number of QRPs in an HSR publication (Exp B = 0.85, 95% CI [0.75, 0.97]).
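
These percentage interpretations follow directly from the log link of the Poisson model; a brief worked illustration using the reported coefficients:

```latex
% Rate-ratio interpretation of a Poisson regression coefficient.
\[
\log \mathbb{E}[\text{QRPs}] = \beta_0 + \beta_1 x_1 + \dots + \beta_k x_k
\;\Rightarrow\;
\frac{\mathbb{E}[\text{QRPs}\mid x_j + 1]}{\mathbb{E}[\text{QRPs}\mid x_j]}
= e^{\beta_j} = \mathrm{Exp}(B_j)
\]
% "Pressure to create societal impact": Exp(B) = 1.28, i.e. (1.28 - 1) x 100% = +28% QRPs per point.
% "Specific training":                  Exp(B) = 0.85, i.e. (0.85 - 1) x 100% = -15% QRPs per point.
```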

Stratified analyses between first and last authors

Results from stratified analyses between first and last authors, along with results of multivariate analyses using the fully adjusted model, are included in Table 5. A complete description of our stratified analysis with all applied models can be found in the supplementary material 5.

Table 5 Comparison of factors associated with the number of QRPs in reporting of messages and conclusions in HSR publication between first and last authors

For first authors, findings indicated that “specific training in reporting messages and conclusions” (Exp B = 0.84, 95% CI [0.72, 0.98]) was associated with fewer QRPs, whereas “feedback culture” at their research institute (Exp B = 1.24, 95% CI [1.05, 1.47]) and “pressure to create societal impact” (Exp B = 1.24, 95% CI [1.02, 1.51]) were associated with a higher number of QRPs. For last authors, no significant relationships were identified between the factors and the number of QRPs.

Discussion

The aim of this study was to explore the possible association between individual, institutional, and scientific environment factors and inadequacies in the reporting of messages and conclusions in scientific HSR publications.

We identified three factors independently associated with QRPs in the reporting of messages and conclusions in scientific HSR publications, i.e., “pressure to create societal impact”, “specific training in reporting messages and conclusions”, and “co-author conflict of interest”. Stratification between first and last authors indicated that different factors were related to the occurrence of QRPs.

Interpretation

Our results indicated three factors are independently associated with QRPs in the reporting of messages and conclusions in HSR literature. The other factors in the assessed framework, however, are not irrelevant. All included factors may relate to multiple aspects of the publication process and are worth addressing in future studies on QRPs. Our study was explorative, and we therefore recommend further empirical research on the resulting factors.

The association between a higher number of QRPs and the factor “pressure to create societal impact” provides important insights into the current research culture of HSR. HSR is often intended for practical intervention [23]. To improve the connection between HSR and policy and practice across the entire field, researchers are encouraged to disseminate their findings to policy makers, professionals, and the public via societal publications [24]. HSR researchers may anticipate their societal impact when writing their scientific publications. Hence, they may unconsciously adapt their language and writing to present concrete and actionable conclusions suited to attracting the attention of the media or the professional community [25]. It is generally assumed that pressure to create societal impact pushes authors to overstate conclusions in press releases or other societal publications. However, the current findings suggest a possible effect on scientific reporting as well. Currently, researchers may not have the means to responsibly create societal impact or may have difficulty aligning their scientific messages with societal messages. Not all researchers are equally equipped for this task, owing to differences in research experience or practical experience in healthcare. While research and practice in HSR are traditionally intertwined, researchers from other disciplines, such as biomedical research, will similarly experience an increasing need to create societal impact. With the increasing attention to societal impact by the scientific community at large, future studies addressing scientific reporting should take into account the association between the perception of research impact and reporting in scientific publications.

“Specific training in reporting messages and conclusions” was associated with a lower number of QRPs. A positive association between training and the improvement of reporting skills has been identified in previous studies [26, 27]. The practice of phrasing and reporting is an inherent part of extracting messages and conclusions from results. From our focus groups we know that some researchers receive training that specifically combines these skills. Because the participants self-reported on their level of specific training, our findings suggest that some courses offered by HSR institutions in the Netherlands may provide researchers with helpful tools to improve their writing. Moreover, researchers may be capable of recognizing that they need more specific training. Institutions should assure that those who need specific training in reporting messages and conclusions are able to obtain it.

“Co-author conflict of interest” was associated with a lower number of QRPs. This finding contradicts the assumption that research quality generally decreases when a conflict of interest arises. A possible explanation may be that awareness of a co-author’s conflict of interest stimulated a more nuanced or careful interpretation of the research findings. Policies in place at HSR institutions could assure that such conflicts of interest are positively mitigated and result in more attention to research conduct [28].

One method used by research institutions to foster a stimulating debate is to introduce structured peer feedback [29]. Although some institutions in the Netherlands have invested in structured feedback support for their researchers, feedback culture was not associated with a lower number of QRPs in the current study. Surprisingly, the analyses differentiating between first and last authors indicated that feedback culture may contribute to more QRPs for first authors. Feedback structures differ for each institution, and some might not be aimed sufficiently at the interpretation and reporting of messages and conclusions. It could thus be worthwhile to investigate how feedback structures can better support authors and what type of feedback culture would specifically create a stimulating debate. The assessment form developed for assessing QRPs in scientific publications might guide structured feedback on reported messages and conclusions specifically, and may be retrieved from supplementary material 1 of our previous open access publication [6].

Our analyses further indicate that factors contributing to QRPs may be different for first and last authors. First authors may contribute to more QRPs when they experience more pressure to create societal impact and a positive feedback culture. They may contribute to fewer QRPs when they receive more specific training in reporting messages and conclusions. Additional research on the unique roles of first and last authors in the prevention of QRPs when reporting messages and conclusions in scientific HSR publications is recommended.

Limitations

The main strength of our approach was our mixed methods design. By constructing a framework from the experiences of a sample of health services researchers, we could tailor the survey to our study participants. Moreover, most research on research integrity is derived from self-report. Our assessment of QRPs provides a more impartial approach.

Considering the large turnover of research staff and PhD students at each institution, a response rate of 45% may be considered acceptable. The average number of QRPs in the publications of respondents was similar to that of all assessed HSR publications. Nevertheless, non-respondents might have declined participation because of time pressures or a lack of communication with the HSR community. The relatively small sample size in the current study is a further limitation and necessitates replication in larger and more diverse samples. Further, due to our explorative aim, we decided to include factors with a lower threshold of reliability. In follow-up research we recommend improving the factors’ reliability.

We acknowledge the publications we analysed are nested in the thirteen participating Dutch institutions, which may influence the associations between institutional factors and QRPs. However, intraclass variation cannot be fully avoided. Institutions differ in structural conditions/resources, social conditions, role of supervisors, and cultural conditions. Consequently, associations may be attributed to an institutional factor although it is dependent on the context of a particular institution. A multilevel analysis including the institution in which the publication is nested would be the ideal option to address this issue. However, such an analysis would likely have required a larger dataset to provide a robust estimation of effects, particularly given the fact that publications are typically written by authors from multiple institutions. Considering the relatively small sample size and the explorative nature of our study, a single-level regression was a more appropriate choice. Further study with a larger data set and clearly distinguished institutions will allow for a more sophisticated analysis technique to confirm findings from our study.

The assessed publications were published in 2016. Researchers might not have had a vivid memory of their working experience two years prior, which may have made the connections between publication and author somewhat less reliable. Nevertheless, we do not expect institutional or scientific environment factors to have changed significantly over the course of two years, so the risk of recall bias was likely minimal.

The researchers and studies included in this study all originated from Dutch research institutions. Institutional structures and individual experiences of research culture will differ across countries. For instance, Dutch universities employ PhD students, who are often the primary authors of research papers, rather than providing scholarships. Moreover, they share supervisory structures requiring at least two senior supervisors. Nevertheless, HSR researchers and institutions abroad often deal with challenges similar to those encountered in the Netherlands, including publication pressure and creating societal impact. Aspects of the results of the current study are thus likely to provide a helpful guide for HSR institutions internationally.

Implications and recommendations for policy and practice

The field of HSR may require a different set of guidelines for assessing QRPs than the biomedical field. Due to the different research questions and methodologies applied, the studies presented in HSR should adhere to the standards set by reporting guidelines and tools designed specifically for the field [30]. Peer reviewers selected for these studies should therefore be familiar with the standards of the field. Prior to publication, QRPs in the reporting of messages and conclusions may be intercepted during peer review and brought to the attention of the authors. We thus encourage journals to request that peer reviewers assess a publication for the presence of the discussed QRPs.

Our study moreover identified factors that are best addressed through changes by research institutions. The results should stimulate awareness within the HSR community internationally. In support of a more responsible translation of findings to policy and practice, research institutions should address the identified factors to contribute to better reporting in scientific HSR publications. We recommend the development of institutional interventions to encourage responsible reporting of messages and conclusions in HSR. Specialized writing courses and workshops may increase writing confidence [27]. Specific training on writing discussions and conclusions that is already in place at some research institutions should be extended to all health services researchers who do not yet have access to it. HSR institutions should further prioritize providing a positive feedback culture by stimulating debate and making conflicting interests explicit. They should, moreover, introduce systemic changes such as organizing peer review with engaging discussions and providing sufficient time and support for balancing scientific reporting with creating societal impact. Across the HSR field, institutions are already taking action to assure responsible research practices; we therefore recommend that they strengthen the coherence of their efforts, including through collaboration.

Conclusion

Experienced pressure to create societal impact is associated with a higher number of QRPs in the reporting of messages and conclusions in HSR publications. Specific training in reporting messages and conclusions and awareness of co-author conflicts of interest are related to fewer QRPs in HSR publications. This study was exploratory, and we therefore recommend further research on the identified factors. Our results should stimulate awareness within the field of HSR internationally of opportunities to better support reporting in scientific HSR publications, and thus a more responsible translation of findings to policy and practice.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request. Project files are available in the Figshare public repository: https://doi.org/10.21942/uva.9255335.

Abbreviations

CI: Confidence Interval
B: Beta
Exp(B): Exponentiation of the Beta coefficient
HSR: Health Services Research
QRP: Questionable Research Practices
SD: Standard Deviation
SE: Standard Error
ZonMw: The Netherlands Organization for Health Research and Development

References

  1. Glasziou P, Altman DG, Bossuyt P, Boutron I, Clarke M, Julious S, et al. Reducing waste from incomplete or unusable reports of biomedical research. Lancet. 2014;383(9913):267–76.
  2. Dechartres A, Trinquart L, Atal I, et al. Evolution of poor reporting and inadequate methods over time in 20 920 randomised controlled trials included in Cochrane reviews: research on research study. BMJ. 2017;357:j2490.
  3. Boutron I, Ravaud P. Misrepresentation and distortion of research in biomedical literature. Proc Natl Acad Sci U S A. 2018;115(11):2613–9.
  4. Fletcher RH, Black B. Spin in scientific writing: scientific mischief and legal jeopardy defining scientific misconduct. Med Law. 2007;26:511–26.
  5. Horton R. The rhetoric of research. BMJ. 1995;310(6985):985.
  6. Reijmerink W, Robben P, Ruwaard D, Vermeulen H. Naar gezond zorgonderzoek: interactieve leerervaringen uit het ZonMw-programma Gezondheidszorgonderzoek. Tijdschr Voor Gezondheidswetenschappen. 2014;92(8):319–22.
  7. Lavis JN, Ross SE, Hurley JE, Hohenadel JM, Stoddart GL, Woodward CA, et al. Examining the role of health services research in public policymaking. Milbank Q. 2002;80(1):125–54.
  8. Gerrits RG, Jansen T, Mulyanto J, van den Berg MJ, Klazinga NS, Kringos DS. Occurrence and nature of questionable research practices in the reporting of messages and conclusions in international scientific health services research publications: a structured assessment of publications authored by researchers in the Netherlands. BMJ Open. 2019;9(5):e027903.
  9. Burstein P. The impact of public opinion on public policy: a review and an agenda. Polit Res Quart. 2003;56(1):29–40.
  10. Hanney SR, Gonzalez-Block MA, Buxton MJ, Kogan M. The utilization of health research in policy-making: concepts, examples and methods of assessment. Health Res Policy Syst. 2003;1(2). https://doi.org/10.1186/1478-4505-1-2.
  11. Kuruvilla S, Mays N, Walt G. Describing the impact of health services and policy research. J Health Serv Res Policy. 2007;12(Suppl 1):S1–23-31.
  12. Gerrits RG, van den Berg MJ, Klazinga NS, Kringos DS. Statistics in Dutch policy debates on health and healthcare. Health Res Policy Syst. 2019;17(1):55.
  13. Bouter LM. Commentary: perverse incentives or rotten apples? Account Res. 2015;22(3):148–61.
  14. Caulfield T. The commercialisation of medical and scientific reporting. PLoS Med. 2005;1(3):e38.
  15. Altman DG, Simera I, Hoey J, Moher D, Schulz K. EQUATOR: reporting guidelines for health research. Lancet. 2008;371.
  16. Moher D, Glasziou P, Chalmers I, Nasser M, Bossuyt PMM, Korevaar DA, et al. Increasing value and reducing waste in biomedical research: who's listening? Lancet. 2016;387(10027):1573–86.
  17. Editorial. Opening up peer review. Nature. 2018;560(7720):527.
  18. Schrag NJ, Purdy GM. Step up for quality research. Science. 2017;357(6351):531.
  19. Haven TL, Bouter LM, Smulders YM, Tijdink JK. Perceived publication pressure in Amsterdam: survey of all disciplinary fields and academic ranks. PLoS One. 2019;14(6).
  20. Crain AL, Martinson BC, Thrush CR. Relationships between the survey of organizational research climate (SORC) and self-reported research practices. Sci Eng Ethics. 2013;19(3):835–50.
  21. Tijdink JK, Verbeke R, Smulders YM. Publication pressure and scientific misconduct in medical scientists. J Empir Res Hum Res Ethics. 2014;9(5):64–71.
  22. O'Cathain A, Murphy E, Nicholl J. The quality of mixed methods studies in health services research. J Health Serv Res Policy. 2008;13(2):92–8.
  23. Bentley P, Kyvik S. Academic staff and public communication: a survey of popular science publishing across 13 countries. Public Underst Sci. 2010;20(1):48–63.
  24. Council for Medical Sciences. The societal impact of applied health research: towards a quality assessment system. Amsterdam: Royal Netherlands Academy of Arts and Sciences; 2002.
  25. Weingart P. Science and the media. Res Policy. 1998;27(8):869–79.
  26. Rickard CM, McGrail MR, Jones R, O'Meara P, Robinson A, Burley M, et al. Supporting academic publication: evaluation of a writing course combined with writers' support group. Nurse Educ Today. 2009;29(5):516–21.
  27. Kramer B, Libhaber E. Writing for publication: institutional support provides an enabling environment. BMC Med Educ. 2016;16(1).
  28. Besley JC, McCright AM, Zahry NR, Elliott KC, Kaminski NE, Martin JD. Perceived conflict of interest in health science partnerships. PLoS One. 2017;12(4):e0175643.
  29. Fanelli D, Costas R, Lariviere V. Misconduct policies, academic culture and career stage, not gender or pressures to publish, affect scientific integrity. PLoS One. 2015;10(6):e0127556.
  30. Sheikh K, Gilson L, Agyepong IA, Hanson K, Ssengooba F, Bennett S. Building the field of health policy and systems research: framing the questions. PLoS Med. 2011;8(8):e1001073.


Acknowledgements

We are grateful for the participation of researchers in the interviews, focus groups and survey and the support from their affiliated HSR institutions to perform this study, which include: Erasmus MC, Department of Public Health; Erasmus University, Erasmus School of Health Policy and Management; Leiden University Medical Centre, Department of Medical Decision Making and the Department of Public Health and Primary Care; University Maastricht, Health Services Research group; the Netherlands Institute for Health Services Research (NIVEL); Radboud UMC, IQ healthcare; the National Institute for Public Health and the Environment (RIVM); University of Groningen, Faculty of Economics and Business; Tilburg University, Social and Behavioural Sciences, Tranzo; Trimbos Institute; University Medical Centre Utrecht, Julius Centre for Health Sciences and Primary Care; Amsterdam UMC – Vrije Universiteit Amsterdam, and the Amsterdam UMC - University of Amsterdam. Furthermore, we also would like to thank project adviser Anton Kunst for his valuable feedback on the study design.

Funding

This study was funded by grant number 445001003 from the Netherlands Organisation for Health Research and Development (ZonMw). The funder had no role in the study design, the collection, analysis and interpretation of data, the writing of the manuscript or the decision to submit the paper for publication. All authors had full access to the data during the conduct of the study and they take responsibility for the integrity of the data and the analysis.

Author information

Contributions

All authors contributed to the design of the study. RG, JM and JW collected and analysed the data. RG and JM drafted the manuscript. MB, NS, DK and JW were involved in interpreting the results and were major contributors to the writing of the manuscript. All authors read and approved the final manuscript and are accountable for all aspects of the work.

Corresponding author

Correspondence to Reinie G. Gerrits.

Ethics declarations

Ethics approval and consent to participate

Ethical approval was deemed unnecessary under the Dutch Medical Research Involving Human Subjects Act (Wet medisch-wetenschappelijk onderzoek met mensen, WMO; BWBR0009408, Ministry of Health, Welfare and Sport, 1998, The Hague, Netherlands). A waiver for ethical approval was therefore obtained for this study from the Medical Ethics Review Committee at Amsterdam UMC. Survey participants provided written consent to participate at the start of the survey. Interview and focus group participants provided verbal consent at the start of all interviews and focus groups. Verbal consent was considered adequate as no human data were retained. To assure voluntary participation, informed consent was reiterated at the start of each focus group and recorded.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.


Supplementary information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Gerrits, R.G., Mulyanto, J., Wammes, J.D. et al. Individual, institutional, and scientific environment factors associated with questionable research practices in the reporting of messages and conclusions in scientific health services research publications. BMC Health Serv Res 20, 828 (2020). https://doi.org/10.1186/s12913-020-05624-5
