
A mixed methods evaluation assessing the feasibility of implementing a PrEP data dashboard in the Southeastern United States



Alabama is one of seven priority states for the National Ending the HIV Epidemic Initiative due to a disproportionate burden of rural infections. To reverse growing infection rates, the state must increase its focus on prevention efforts, including novel strategies. One such approach is to utilize dashboards that visualize real-time data on the pre-exposure prophylaxis (PrEP) care continuum to assist in prioritizing evidence-based preventative care for those most vulnerable to HIV infection.


We conducted a mixed methods evaluation to ascertain stakeholders’ perceptions of the acceptability, feasibility, appropriateness, and usability of a PrEP care continuum dashboard, as well as to gain insight into ways to improve the activities necessary to sustain it. Clinicians, administrators, and data personnel from participating sites in Alabama completed surveys (n = 9) and participated in key informant interviews (n = 10) to better understand their experiences with the prototype data dashboard and to share feedback on how it can be modified to best fit their needs.


Surveys and interviews revealed that all participants found the pilot data dashboard to be an acceptable, feasible, and appropriate intervention for clinic use. Overall, stakeholders found the pilot dashboard usable and helpful in administrative efforts, such as report and grant writing; however, additional refinement is needed to reduce burden and optimize usefulness. Participants voiced concerns about their sites’ abilities to sustain the dashboard, including the lack of systematized PrEP protocols and the limited funds and staff time dedicated to PrEP data collection, cleaning, and upload.


Study participants from clinics providing HIV prevention services, including PrEP, in Alabama voiced interest in sustaining and refining a data dashboard that tracks clients across the PrEP care continuum. Although they viewed the platform itself as an acceptable, feasible, and appropriate intervention, participants agreed that efforts must focus on standardizing PrEP data collection protocols to ensure consistent, accurate data capture, and noted that limited funds and staff time are barriers to the sustained implementation of the dashboard in practice.


Background
In recent years, nationally coordinated efforts, such as the National HIV/AIDS Strategy and Ending the HIV Epidemic (EHE), have strived to prioritize HIV prevention in regions with high prevalence, specifically through effective biomedical interventions such as HIV pre-exposure prophylaxis (PrEP). Alabama (AL) is one of seven priority states for EHE efforts due to its significant burden of rural HIV diagnoses, as well as various social and economic disparities that disproportionately impact populations vulnerable to infection. To more effectively address increasing HIV infection rates, priority must be placed on prevention efforts, such as identifying and filling gaps in prevention service provision [1, 2].

In the last few years, public data management and visualization systems, such as AIDSVu, have been used to guide and evaluate public health programs, and allow diverse stakeholders the opportunity to engage with, understand, and use data to inform interventions and policymaking [3]. While such platforms have focused heavily on existing population-based HIV data, there is a need to track client-level data across the PrEP care continuum, which would provide real-time, actionable data to community providers and federal agencies attempting to implement evidence-based interventions aimed at reducing HIV incidence.

Data dashboards allow clinicians and managers to visualize and explore data on care processes and outcomes, making them a powerful and supportive tool in decision-making [4]. Dashboards are used in healthcare settings to monitor a variety of healthcare issues, including patient safety, vaccine benefit-risk analysis, quality improvement, opioid overdoses, and inter-departmental coordination [5,6,7,8,9]. Although effectiveness research on data dashboards in HIV care is somewhat limited, recent studies have demonstrated that dashboards hold the potential to increase public access to HIV epidemiological data, enable researchers’ access to HIV data, and enhance clinical care [3, 10, 11]. After implementing health department-based HIV data dashboards in 2011, for instance, New York City saw a 1% increase in linkage-to-care and a 16% increase in the percentage of HIV patients with viral load suppression among participating clinics by 2016 [10, 11]. Moreover, the past few years have demonstrated the utility of dashboards in communicating the real-time status of the COVID-19 pandemic, although preliminary studies indicate that the majority of dashboards developed for COVID-19 are only somewhat actionable, primarily due to a lack of focus on identifying specific target audiences [12, 13].

Despite the promise that data dashboards hold in improving HIV care, little is known about how to effectively implement dashboards in clinical settings. Findings from a recent European Union-funded project demonstrate that numerous challenges arise when trying to implement performance dashboards in healthcare settings, including divergent stakeholder expectations; the tension between providing meaningful, accurate, and timely data; and differing dashboard needs and purposes [14]. Other studies have shown that the success of a data dashboard intervention also hinges on the willingness of a site to invest in the intervention financially and with human resources, as well as on the ability of the dashboard to meet certain site requirements. Such requirements include customization capabilities, which would enable users to change display options to best suit preferences, and reporting capabilities, which would allow users to generate visual reports in various formats for site, state, and federal reporting requirements [7, 15]. Studies of other healthcare tools, like electronic health records (EHRs), air-cleaning technology, and smartphone apps, reveal similar implementation challenges [16,17,18,19]. Notably, though, despite these barriers, many novel technologies, such as EHRs, have been widely adopted in healthcare settings, suggesting the same may be possible for dashboard use in HIV care [20, 21].

Our project, PrOTECT AL (PrEP Optimization Through Enhanced Continuum Tracking), is a multiphase, participatory study that coalesces community and public health partnerships to visualize the state’s PrEP care continuum via a public-facing data dashboard, a new technology in this space. Building on successful preliminary work, we piloted a dashboard that provided PrEP clinicians, state health officials, and community-based organizations aggregate-level data visualizations of persons engaged in PrEP care across the state. The objective of this study is to better understand the perspectives of key stakeholders, including providers, data personnel, and clinic managers, on the feasibility, acceptability, appropriateness, and usability of the prototype PrEP data dashboard, as well as to assess what additional modifications are needed to enhance platform usability, ensure adoption, and facilitate implementation. We believe this project will lay the necessary groundwork for future identification of gaps in the PrEP care continuum.

Methods
The current study was a prospective investigation using a mixed methods approach to evaluate key stakeholders’ perceptions of the feasibility, acceptability, appropriateness, and usability of the pilot PrEP data dashboard. Future stages of this study will evaluate the ability of the dashboard to provide policymakers and providers with actionable data to identify and mitigate any gaps in services. Participation in PrOTECT AL was offered to members of the Alabama Quality Management Group (AQMG), a consortium of 13 clinics that receive Ryan White HIV/AIDS parts C and D program funding in AL. Ten sites expressed interest in participation, but over the course of the study, two sites were unable to continue engagement due to lack of staff and competing priorities. Thus, eight sites from across the state participated in the project: AIDS AL South (Mobile), Health Services Center (Anniston), Five Horizons Health Services (formerly called Medical Advocacy and Outreach in Montgomery), Thrive Alabama (Huntsville), UAB Family Clinic (Birmingham), the UAB 1917 Clinic (Birmingham), Unity Wellness Center (Opelika), and Whatley Health Services (Tuscaloosa). These sites provide a variety of HIV treatment and prevention services and are geographically dispersed throughout the state (Fig. 1). While these sites receive Ryan White funding for patients living with HIV, per the Ryan White Care Act Policy, the use of these funds for PrEP medications and related preventative services is limited [22]. Despite these limitations, the clinics’ PrEP care footprint within the state is quite large: they provide almost half of all PrEP care across the entire state of AL [3]. Therefore, prior to this study, our investigative team, in collaboration with AQMG partners and the state health department, engaged in an implementation mapping process to better understand how we could leverage PrEP data captured across AQMG organizations to identify service delivery gaps in real time. The PrOTECT AL data dashboard was felt to be the key implementation strategy necessary to critically visualize and assess real-time data that would promote PrEP prescription [23].

Fig. 1 Map of HIV Prevalence in Alabama and participating Ryan White HIV/AIDS Program Clinics [3]. Created with QGIS [24]

Using findings from surveys, interviews, and focus group discussions in earlier iterations of the study, the research team developed a pilot version of a PrEP data dashboard and a corresponding clinic-facing website [23]. To develop this dashboard, our partners completed a test data upload that corresponded with an agreed-upon data codebook developed via focus group discussions; this upload included three months of clinic data on clients seeking STI/HIV testing services and PrEP clients. Data Use Agreements (DUAs) with each site were established prior to data submission and detailed the agreed-upon data elements to be collected. Clinics were provided with a Data Management Protocol, a document designed to guide participating sites through the goals and responsibilities for participation, as well as detail the upload process. Sites were also given a final copy of the data dictionary with the elements necessary for upload and an upload template that would allow for accurate, consistent data entry for the development of the dashboard. Clinics were given four weeks to compile and upload this data. Six of the eight PrOTECT AL partners submitted clean, usable data for this project.

PrOTECT AL dashboard features

The PrOTECT AL website consists of a public-facing homepage with information on the purpose of the project, resources for finding PrEP across the state, information on project partners, and reading materials on HIV and prevention efforts across AL. From the website, partners can access the data dashboard via a clinic-specific login where they can view both aggregate data and their clinic-level data. As the goal of the dashboard is to visualize the PrEP Care Continuum for participating sites in Alabama, as described by the CDC’s Continuum of PrEP Care (Fig. 2), data elements include, but are not limited to: number of patients referred for, linked to, prescribed, or discontinued from PrEP within the reporting period; primary risk factors recorded; and AL counties served. These elements can be filtered down individually by race, age, and gender. A copy of the full data dictionary can be found in Appendix A.
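The stage counts and demographic filters described above can be sketched as follows. This is an illustrative sketch only: the record fields, stage names, and demographic values are hypothetical stand-ins, not the project's actual data dictionary or dashboard code.

```python
# Hypothetical, simplified sketch of per-stage counting with an optional
# demographic filter (race, age, or gender), as the dashboard is described.

records = [
    {"stage": "referred",   "race": "Black", "age": "25-34", "gender": "M"},
    {"stage": "linked",     "race": "Black", "age": "25-34", "gender": "M"},
    {"stage": "prescribed", "race": "White", "age": "35-44", "gender": "F"},
]

def continuum_counts(records, **filters):
    """Count clients at each PrEP continuum stage within a reporting
    period, keeping only records matching the given demographic filter."""
    counts = {}
    for r in records:
        if all(r.get(field) == value for field, value in filters.items()):
            counts[r["stage"]] = counts.get(r["stage"], 0) + 1
    return counts

print(continuum_counts(records, race="Black"))
# {'referred': 1, 'linked': 1}
```

Filtering on a single demographic field at a time mirrors the pilot dashboard's behavior; the cross-filtering participants requested later would correspond to passing several keyword filters at once.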

Fig. 2 Continuum of PrEP Care. Modified from CDC [25]

Following development, partners were given approximately two weeks to engage with the dashboard and website to provide feedback on the acceptability and usability of the data dashboard, as well as perceptions on the feasibility of adopting this dashboard in their clinic.

AIM, IAM, FIM survey and SUS survey

Following development and engagement with the pilot PrEP data dashboard, 10 stakeholders across the eight participating sites were asked to complete an anonymous survey via Qualtrics, an online survey platform. An overview of participation can be found in Table 1.

The goal of this survey was to measure stakeholders’ perceptions on the acceptability, appropriateness, and feasibility of the PrEP data dashboard, as well as the usability of the platform.

Table 1 Overview of participation per site

The survey included the Acceptability of Intervention Measure (AIM), Intervention Appropriateness Measure (IAM), and Feasibility of Intervention Measure (FIM), each of which is a four-item measure of implementation outcomes that are often considered “leading indicators” of implementation success [26]. Definitions for these constructs can be found in Table 2 [27]. The survey also included the System Usability Scale (SUS), a ten-item questionnaire to help provide a global view of subjective assessments of the usability of a system [28]. Each of these measures consists of a five-point Likert scale ranging from “Completely/Strongly Disagree (1)” to “Completely/Strongly Agree (5)”.

Table 2 Acceptability, appropriateness, and feasibility framework

Semi-structured interviews

To better understand the acceptability of the intervention, an in-depth interview guide informed by constructs of acceptability articulated by Sekhon et al. was developed [29]. The Acceptability Framework defines acceptability as a “multi-faceted construct that reflects the extent to which people delivering a healthcare intervention consider it to be appropriate, based on anticipated or experiential cognitive and emotional responses to the intervention” [29]. This framework was chosen because it measures prospective, concurrent, and/or retrospective assessment of acceptability based on an individual stakeholder’s subjective evaluation of the intervention. It is important to note that this framework differs from the AIM, IAM, and FIM framework proposed by Proctor et al., as it considers “appropriateness” to be part of “acceptability.”

While the interviews primarily focused on the dashboard itself, we also asked about perceptions of the activities and processes necessary to develop and maintain the dashboard. Interview questions also addressed general feelings towards the public-facing PrOTECT AL website.

Interviews were conducted with a purposive sample of key stakeholders from seven of the eight partner sites; a representative from one of the partner sites could not be reached to participate in an interview. Stakeholders included administrative staff, data personnel, and providers who had engaged with the pilot version of the data dashboard. Ten semi-structured interviews were conducted from August 8 to August 30, 2022, lasting an average of 35 min. The interviews were conducted over Zoom by a member of the Evaluation Unit within the Research and Informatics Service Center (RISC). These interviews were recorded on Zoom; transcribed via Rev, an online transcription service; and coded via NVivo Qualitative Data Analysis software.

This study’s protocols, documents, and forms, including a waiver of documentation for informed consent, were approved by the University of Alabama at Birmingham Institutional Review Board (IRB-300004157). Informed consent was obtained verbally and recorded as part of the transcript.

Data analysis
Survey responses were scored according to each instrument’s scoring criteria. AIM, IAM, and FIM scale values range from one (1) to five (5), with no reverse scoring; item scores are summed per response and averaged. SUS responses are summed and weighted to determine an overall score per response, with final scores ranging from 0 (no usability) to 100 (perfect usability) [28].
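As a concrete illustration of this scoring, the sketch below computes an AIM/IAM/FIM mean and a standard SUS score. The item responses are invented examples, not study data; the SUS weighting follows the instrument's conventional published scoring (odd items contribute value − 1, even items contribute 5 − value, total scaled by 2.5).

```python
# Illustrative scoring sketch; the response values are hypothetical.

def aim_iam_fim_score(responses):
    """Mean of four 5-point Likert items (no reverse scoring)."""
    assert len(responses) == 4
    return sum(responses) / len(responses)

def sus_score(responses):
    """System Usability Scale: ten 5-point items; odd items score
    (value - 1), even items (5 - value); the total is scaled by 2.5
    so the final score ranges from 0 to 100."""
    assert len(responses) == 10
    total = 0
    for item, value in enumerate(responses, start=1):
        total += (value - 1) if item % 2 == 1 else (5 - value)
    return total * 2.5

print(aim_iam_fim_score([5, 4, 5, 4]))            # 4.5
print(sus_score([4, 2, 4, 1, 5, 2, 5, 1, 4, 2]))  # 85.0
```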

Each interview was coded via NVivo Qualitative Data Analysis software by two coders from the Evaluation Unit within RISC. To help inform qualitative analysis, the research team employed deductive analysis and was guided by constructs from the Acceptability Framework: (1) Affective Attitude; (2) Burden; (3) Ethicality; (4) Intervention Coherence; (5) Opportunity Costs; (6) Perceived Effectiveness; and (7) Self-efficacy. Two of the seven codes, Ethicality and Intervention Coherence, were not used in analysis, as they were not applicable to the purpose of this study.

Following initial coding, the research team developed additional subcodes based on recurring findings from the interviews. For example, the subcode Proposed Modifications & Additions was added to better grasp stakeholders’ ideas for refining the platform. Given the large volume of references, we added further subcodes to better group ideas around changes or additions to the intervention and intervention activities: aesthetics, data, data collection procedures, features, public-facing information, upload process, and other/miscellaneous. The most common subcodes among these involve proposed modifications or additions to available features on the dashboard and public-facing information. Similarly, to better understand information on Burden and Perceived Effectiveness, we added the subcodes time, personnel, data capture, and intervention materials and good fit, missing or incomplete data, reporting, and visualization, respectively.

Fig. 3 AIM Responses

Fig. 4 IAM Responses

Fig. 5 FIM Responses

The coders met three times throughout the analysis process to ensure inter-coder reliability, which reached a kappa coefficient of 0.75.
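Inter-coder agreement of the kind reported here is commonly computed as Cohen's kappa, which corrects raw agreement for the agreement expected by chance. The sketch below shows the standard calculation on invented labels, not the study's actual codes.

```python
# Minimal Cohen's kappa sketch for two coders; labels are hypothetical.
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Chance-corrected agreement between two coders' label sequences:
    kappa = (observed - expected) / (1 - expected)."""
    assert len(coder1) == len(coder2) and coder1
    n = len(coder1)
    observed = sum(a == b for a, b in zip(coder1, coder2)) / n
    c1, c2 = Counter(coder1), Counter(coder2)
    # Chance agreement: product of each label's marginal proportions.
    expected = sum(c1[label] * c2[label] for label in set(c1) | set(c2)) / (n * n)
    return (observed - expected) / (1 - expected)

print(cohens_kappa(["burden", "burden", "effectiveness", "burden"],
                   ["burden", "effectiveness", "effectiveness", "burden"]))
# 0.5
```

By the conventional Landis-and-Koch benchmarks, a kappa of 0.75 falls in the "substantial" agreement range.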

Table 3 reflects the analysis framework and corresponding interview questions. A copy of the full interview guide is available in Appendix B. A full list of the codes and subcodes, along with their definitions, can be found in Appendix C.

Table 3 Acceptability Framework Constructs and Corresponding Interview Questions

Results
Survey findings

On the AIM (M = 4.5), IAM (M = 4.3), and FIM (M = 4.9) surveys, all participants either Agreed or Completely Agreed with each statement. Only one statement, “The PrOTECT AL Data Dashboard seems to be a good match,” received a mixed response, with one of the nine respondents reporting “neither agree nor disagree.” This indicates that the overwhelming majority of stakeholders find the platform to be acceptable, appropriate, and feasible as an intervention activity. Figures 3, 4, and 5 show responses to the AIM, IAM, and FIM measures, respectively.

According to findings from the SUS survey, the piloted dashboard averaged a score of 79 out of 100. According to SUS scoring interpretations, this score, represented by the red line in Fig. 6, indicates general acceptability and usability by participants. Figure 6 also illustrates the ways in which this score can be interpreted, including the percentile, grade, and adjectives typically used to describe systems of that score. Further, this figure demonstrates that this score is considered “acceptable” by participants and that the dashboard rates as a promoter, meaning that it is likely to be recommended to others by these users.

Fig. 6 SUS Scoring and Interpretation

Interview findings

During interviews and analysis, we identified several primary and secondary findings. Table 4 summarizes our main findings by Acceptability Framework construct.

Affective attitude

Overall, participants found the dashboard and corresponding website to be visually appealing, easy to navigate, and a useful tool for clinic operations. One participant noted:

I really like the initiative and the project. I think it’s, you know, good information, not only for the patient, but for us, you know, as clinical staff, helping [and] providing better care to our patients.

However, many participants recommended including additional features to make the dashboard even more useful for clinic operations. Participants seemed most interested in adding options to cross-filter demographic characteristics, such as race, gender, and age group, noting that it would help not only with reporting and applying for grants, but also with sharing information with colleagues in other departments. Nevertheless, participants particularly enjoyed having a quick, simple view of aggregate data and their clinic’s data.

Table 4 Coded findings in the Acceptability Framework. Primary findings bolded

Overall, participants found the public-facing website to be very clean, clear, and visually appealing; however, many felt it lacked useful information, not only for clinicians but for potential PrEP users as well. One participant stated, “The resources page, I just felt like it needed more information,” and recommended including frameworks and information beyond EHE objectives. Similarly, two other participants expressed interest in adding the full CDC recommendations for PrEP on either the front-facing page or behind the dashboard login.

The majority of participants found the color scheme of the website and dashboard to be warm and welcoming; however, one participant noted that the abundance of red “subconsciously signals bad or incorrect or wrong.”

When asked about additional modifications or features they would like to see on the dashboard, participants suggested: adding a heat map to the dashboard to better visualize PrEP hotspots across the state; creating a page to house project-related documentation, such as the upload template, codebook, and data management protocol; providing asynchronous trainings that can be accessed on-demand; creating a PrEP working group with regularly occurring meetings; creating a standard template for data collection across sites; and developing a less cumbersome upload template.

Perceived effectiveness

When asked if they thought the dashboard would be useful to their clinic’s operations, one participant stated:

It absolutely 100% has already proven to be beneficial. So, I know that it would be [even better] […] if the dashboard was expanded to include the information that y’all gathered from all these other sites and we were finally able to implement this on like a policy level or, you know, a clinic level, so that it was something that was within our clinic flow that we knew that we were doing and capturing and continuing to work on. I literally do not foresee any negatives. I only see positives for the grant writing, for better understanding our community and our clients, for having more in-depth conversations with those clients—making them feel more comfortable to be able to talk to us and to share that information—and, honestly, just better understanding where PrEP stands and where we are at as far as the progression of PrEP within our community.

Similarly, another participant noted:

I think it’d certainly be helpful with reporting to, you know, be able to compare and contrast like this is what we know locally versus what the data we’ve gathered kind of more statewide. But also, I think, for education and advocacy, it’s really great to know those numbers.

Further, participants found the dashboard useful in advocating and educating the community on PrEP, with one participant stating:

How can we make PrEP important? So, you know, I think those numbers, they go beyond clinical care, they go, you know, to advocacy, to education and, I think, really could be used to make a difference. ‘Cuz, you know, I have a friend who says you don’t count if you’re not counted. So, if we don’t count PrEP, if we don’t look at this data, it doesn’t matter, and we need it to matter.

Despite the majority of participants finding the dashboard itself to be a good fit and useful to their clinics, many noted that the intervention activities, such as the data collection and upload process, did not fit into clinic operations and workflow very easily. Many participants, particularly data personnel who held primary responsibility for gathering, organizing, and uploading this data, found the process cumbersome and time-consuming as each clinic has a different protocol for handling and collecting data on PrEP patients.

Participants also noted that clinic staff do not consistently capture all data elements, such as patient-reported outcome (PRO) data, which include questions on food security, transportation security, and housing status. Therefore, despite agreeing to and expressing interest in the dashboard displaying PRO data, such data is difficult, if not impossible, to accurately provide for upload to the dashboard. Despite these issues, once sites captured what data they could, they found the upload process via ShareFile, a secure content collaboration and file-sharing software, to be simple and straightforward.

Moreover, while clinics encountered difficulty collecting this data, many noted that it speaks to the greater issue of a lack of federal focus on prioritizing prevention efforts to address the HIV epidemic. One participant noted:

This has been one of my biggest complaints I will say. And concerns just like with our general HIV workforce in the state of Alabama, is that our education hub…I feel like sometimes fails us. And then our state health department, same thing. And I think anybody would generally feel that way if they worked directly with them and saw kind of some of the inconsistencies…you know, this is the dashboard you can use to… we need, we find, we have comprehensive sexual health education approved in the state of Alabama. We need a curriculum for that that we’re all using, not just the schools. So many things just like that, that I think should, for those of us that have been doing this for so long, we should truly have an end all be all to this by now. And we just don’t and I think this is a true testament to that.

Burden
When asked about burden incurred throughout the study, participants seemed most concerned about which data elements they were and were not able to capture for PrOTECT, as well as how they had to go about collecting all the necessary data. One interviewee described the difficulties, saying:

I think for probably most people, the most difficult piece is, like, gathering the data and then, like, getting it into the right format, and I don’t know that there’s anything that you all could do because of the fact that where everybody’s getting their information from is different.

Similarly, many participants found collating the data into the provided upload template was cumbersome, with one participant noting, “I think because your document was so long, Excel was probably not ideal. Yeah, your codebook was so long.”

Many interviewees described how their clinic’s EHR makes it difficult to collect certain data elements: they have to search different areas of their EHR to collect the information (e.g., provider notes), and, depending on the capabilities of the system, they may have to request assistance from their vendor. Interviewees typically did not blame the study’s protocol for these difficulties but accepted them as inevitable, with one participant stating, “I don’t know a better way either. You know, like sometimes there are just things you have to deal with and overcome because there is no better way.”

An adjacent burden interviewees ran into with the intervention was the time it took to collect and upload data to the dashboard. One person explained:

This took up a lot of my own personal time as I was that main point of contact. And I was jumping back through serology forms or the EHR, whatever it may be, to gather this patient information and to put it all together.

Even though interviewees found the intervention to be somewhat burdensome in regard to data capture and time requirements, most agreed they would continue with the intervention because of the benefits it provided, like data for grant writing. One interviewee explained:

It was helpful for me, but it was just overall, it was just time consuming is what it was. And some things, you know, that deserve the time, we’ll get the time. And the things that really will turn out to be beneficial to us and that you put the time into, then it’s all worth it.

Another provider described in a bit more detail why the intervention is worth the burden to them:

So, I like the HIV testing information we gather, anyways, because I use it for different grants and stuff that we turn those numbers in [for]. So for me, I would have no issue continuing to report that information. That would not be difficult for me to do it all.
Self-efficacy
Despite the barriers faced while trying to implement the intervention, and the sometimes limited effectiveness, it appears most participants feel they are adequately able to perform the necessary intervention tasks. Overall, most interviewees felt they were effectively able to capture the necessary data and upload it in a timely manner despite challenges. One interviewee explained:

If it’s something that we need, we just have to figure out a way to capture it properly on our end. I mean, it’s, I guess it’s doable, you just have to find a way.

A couple of participants discussed how, with the competing demands of healthcare provision, having regularly scheduled deadlines for data uploads would allow them to adequately plan ahead and mitigate any data capture and upload challenges. One interviewee described the best plan for ensuring their site’s continued participation:

My answer is if you continue to tell us that we need data every three months or every quarter, that is gonna be your best bet because, one, that’ll be a reminder to me, like [to] get the data. ‘Cause otherwise it’s like waiting on me to have moments of freedom, which don’t really happen anymore. […] But if you send out the [reminder] every three months or whatever y’all’s timeframe is, I don’t think that’s a problem. I think we could definitely continue and would like to continue adding to the dashboard. I think it’s more so streamlining the way in which we can collect the data.

Integration of qualitative and quantitative findings

The merging of the data allowed us to identify areas of concordance and discordance between qualitative and quantitative results. Interview and survey data agreed in terms of general acceptance and perceptions of appropriateness, feasibility, and usability of the data dashboard; however, it is important to note that surveys did not explicitly ask participants about their perceptions of the activities and protocols necessary to maintain the data dashboard. While the surveys indicate overwhelmingly positive perceptions, expansive interviews that explore implementation activities reveal that participants encountered significant burden when collating data elements for upload. A display of concordant and discordant findings across surveys and interviews is shown in Table 5.

Table 5 Concordance and discordance of quantitative and qualitative findings. Quantitative surveys consist of a 5-point Likert scale from “completely disagree (1)” to “completely agree (5)”

Discussion
Our findings demonstrate that, across Alabama, the participating Ryan White HIV/AIDS Program-funded clinics are interested in and willing to contribute to a PrEP data dashboard. All participating sites found the pilot data dashboard to be feasible, acceptable, and appropriate, and scored the overall system as usable. Further, stakeholders reported that such a dashboard would be useful for clinic operations, specifically in regard to internal and external report writing, clinic presentations, and drafting grant applications. Dashboards are specifically designed to enhance this kind of data communication, as visualization assists with processing and retaining complex information [30, 31]. Public health dashboards have proliferated over the last several years in the wake of the COVID-19 pandemic, but they have also been used to communicate data for a variety of other diseases and health issues [13]. AIDSVu, for example, demonstrates that data visualization efficiently communicates regional HIV burden and associated risk factors, as well as gaps in HIV service provision [3, 32, 33].

Despite finding the dashboard itself feasible, acceptable, and appropriate, stakeholders in our study found the procedures necessary to further develop and maintain the dashboard to be burdensome. Stakeholders highlighted key limitations within their existing clinic capacity and workflow that would impact adoption of a PrEP data dashboard, most notably the lack of a standardized PrEP protocol. To stakeholders, this protocol would include a statewide process for assessing PrEP need and providing PrEP care, as well as collecting data elements for the data dashboard. For the majority of these clinics, little attention is given to prevention efforts due to lack of funding and staff availability. For instance, due to funding restrictions, these clinics can only use Ryan White funds for patients living with HIV, not for PrEP clients. Additionally, many data elements necessary to build out the data dashboard, such as PRO data, are not captured, or are not captured systematically, at many sites due to limited capacity. Some stakeholders also noted that not every client eligible for PrEP comes in specifically for PrEP screening, thereby making it difficult to initiate PrEP-specific questions or protocols.

While the lack of a standardized protocol for PrEP patients generates obstacles, the data most clinics do have available create their own set of challenges. Specifically, clinics reported how time-consuming and difficult it was to locate relevant data elements in their EHRs; even when the items were found, additional time was needed to collate and enter these elements into the provided data template for upload, time that many clinics do not have due to staffing limitations.

These sorts of challenges are not unique to the implementation of our dashboard. A 2015 scoping review found that one primary barrier to the implementation of hospital performance dashboards is poor data quality, and scholars have argued that dashboard implementation in hospitals will likely be unsuccessful without appropriate financial and human resource investments and close collaboration with those who will be using the dashboard in the clinic [7, 34, 35]. Similar implementation challenges exist for other healthcare technologies. For instance, studies have documented that barriers to transitioning from paper records to EHRs include prohibitive costs, technical concerns, and insufficient time [19, 36, 37]. Despite these challenges, EHRs are now used by nearly 80% of office-based physicians and 96% of hospitals, showing that initially disruptive technologies, like dashboards, can eventually become widely used tools in healthcare [20].

Interestingly, despite expressing a desire for a PrEP data dashboard to expedite identification of gaps in the PrEP care continuum, participants did not discuss this use during the course of our study, nor did they address how, or if, the dashboard would change clinic practice. Future exploration is needed to identify how to make a PrEP data dashboard more informative and useful to clinic operations, supporting clinic practice transformation as well as efforts to advocate for funding of HIV prevention data collection. Without nationwide policies (e.g., a national PrEP program) that reach into states with rural epidemics, like Alabama, new and innovative strategies will continue to be needed to precisely identify gaps, allowing for the expedited allocation of public health resources. It is beyond the scope of this pilot to evaluate the costs, in terms of person-hours, software, and infrastructure support, needed to sustain the dashboard; however, it is broadly hypothesized that ending the epidemic will require an integrated health systems approach that prioritizes the allocation of services and resources based on disease burden [38].

Despite the implementation challenges for the PrOTECT AL dashboard, stakeholders in our study reported that, given the proper funding and staff, their clinics would benefit from providing PrEP data and accessing a data dashboard. Visualizing where clients fall on the PrEP care continuum via a data dashboard is a significant first step in identifying and closing gaps in the PrEP care continuum in Alabama, which will move us closer to achieving the ultimate goal of ending the HIV epidemic.


Our study included a number of limitations. Most importantly, we initiated PrOTECT in March 2020, in the midst of the COVID-19 pandemic, a time of increased stress and anxiety for healthcare providers. Two clinics were unable to continue involvement due to lack of staff and competing priorities, and two clinics were unable to provide clean, usable data for similar reasons, leaving a smaller sample for the pilot. Because many of these clinics have few staff members, many of whom serve multiple roles, there were few eligible stakeholders from whom we could collect and report data. We therefore collected surveys and conducted key informant interviews via a convenience sample, with clinic leadership identifying stakeholders including PrEP providers, administrative staff, and data personnel; this sampling approach could introduce bias and limit the representativeness of our findings. Lastly, our study was limited to Ryan White-funded clinics in the state; other clinics that may provide PrEP were not included.


The US continues to face high rates of HIV diagnoses, with a disproportionate burden on rural, Southern states like Alabama. In order to reduce the number of new infections, more attention must be given to scaling up effective prevention interventions, like PrEP. Health technologies, such as data dashboards, have the potential to identify gaps in service provision across the PrEP continuum, as well as opportunities to implement evidence-based interventions tailored to the local context. In general, our stakeholders agree that our prototype PrEP data dashboard is a feasible, appropriate, acceptable, and usable intervention for better understanding where patients fall on the PrEP care continuum across the state of Alabama and where action is needed to keep those most vulnerable engaged in HIV preventative care. Despite this, stakeholders identified that more focus is needed on standardizing PrEP assessment, care, and data collection in order to ensure consistent, accurate data capture.

Data Availability

All data is stored in a secure location. Data may be shared, without patient identifiers, contingent upon approval by the UAB Institutional Review Board. For inquiries, please contact the corresponding author.





EHE: Ending the HIV Epidemic

PrEP: Pre-exposure prophylaxis

PRO: Patient-reported outcomes

EHR: Electronic health record


  1. Mishra S, Silhol R, Knight J, Phaswana-Mafuya R, Diouf D, Wang L, et al. Estimating the epidemic consequences of HIV prevention gaps among key populations. J Int AIDS Soc. 2021;24(Suppl 3):e25739.

  2. Beyrer C, Adimora AA, Hodder SL, Hopkins E, Millett G, Mon SHH, et al. Call to action: how can the US ending the HIV Epidemic initiative succeed? Lancet. 2021;397:1151–6.

  3. Sullivan PS, Woodyatt C, Koski C, Pembleton E, McGuinness P, Taussig J, et al. A Data visualization and dissemination resource to support HIV Prevention and Care at the local level: analysis and uses of the AIDSVu Public Data Resource. J Med Internet Res. 2020;22:e23173.

  4. Randell R, Alvarado N, McVey L, Ruddle RA, Doherty P, Gale C, et al. Requirements for a quality dashboard: lessons from National Clinical Audits. AMIA Annu Symp Proc. 2019;2019:735–44.

  5. Dowding D, Randell R, Gardner P, Fitzpatrick G, Dykes P, Favela J, et al. Dashboards for improving patient care: review of the literature. Int J Med Inform. 2015;84:87–100.

  6. Bollaerts K, De Smedt T, Donegan K, Titievsky L, Bauchau V. Benefit-risk monitoring of vaccines using an interactive dashboard: a methodological proposal from the ADVANCE Project. Drug Saf. 2018;41:775–86.

  7. Buttigieg SC, Pace A, Rathert C. Hospital performance dashboards: a literature review. J Health Organ Manag. 2017;31:385–406.

  8. Wu E, Villani J, Davis A, Fareed N, Harris DR, Huerta TR, et al. Community dashboards to support data-informed decision-making in the HEALing communities study. Drug Alcohol Depend. 2020;217:108331.

  9. Egan M. Clinical dashboards: impact on workflow, care quality, and patient safety. Crit Care Nurs Q. 2006;29:354–61.

  10. Braunstein SL, Coeytaux K, Sabharwal CJ, Xia Q, Robbins RS, Obeng B, et al. New York City HIV Care Continuum dashboards: using Surveillance Data to improve HIV Care among people living with HIV in New York City. JMIR Public Health Surveill. 2019;5:e13086.

  11. Joshi A, Amadi C, Katz B, Kulkarni S, Nash D. A human-centered platform for HIV Infection reduction in New York: development and usage analysis of the ending the epidemic (ETE) dashboard. JMIR Public Health Surveill. 2017;3:e95.

  12. Barbazza E, Ivankovic D. What makes COVID-19 dashboards actionable? Lessons learned from international and country-specific studies of COVID-19 dashboards and with dashboard developers in the WHO European Region. Eur J Pub Health. 2021; 31.

  13. Ivankovic D, Barbazza E, Bos V, Brito Fernandes O, Jamieson Gilmore K, Jansen T, et al. Features constituting Actionable COVID-19 dashboards: descriptive Assessment and Expert Appraisal of 158 public web-based COVID-19 dashboards. J Med Internet Res. 2021;23:e25682.

  14. van Elten HJ, Sulz S, van Raaij EM, Wehrens R. Big Data Health Care innovations: performance dashboarding as a process of collective sensemaking. J Med Internet Res. 2022;24:e30201.

  15. Rabiei R, Almasi S. Requirements and challenges of hospital dashboards: a systematic literature review. BMC Med Inform Decis Mak. 2022;22:287.

  16. Alvarenga MOP, Dias JMM, Lima B, Gomes ASL, Monteiro GQM. The implementation of portable air-cleaning technologies in healthcare settings - a scoping review. J Hosp Infect. 2023;132:93–103.

  17. Schoville RR, Titler MG. Guiding healthcare technology implementation: a new integrated technology implementation model. Comput Inf Nurs. 2015;33:99–107. quiz E1.

  18. Struik MH, Koster F, Schuit AJ, Nugteren R, Veldwijk J, Lambooij MS. The preferences of users of electronic medical records in hospitals: quantifying the relative importance of barriers and facilitators of an innovation. Implement Sci. 2014;9:69.

  19. Zandieh SO, Yoon-Flannery K, Kuperman GJ, Langsam DJ, Hyman D, Kaushal R. Challenges to EHR implementation in electronic- versus paper-based office practices. J Gen Intern Med. 2008;23:755–61.

  20. National Trends in Hospital and Physician Adoption of Electronic Health Records. HealthIT.gov: Office of the National Coordinator for Health Information Technology; 2021. Available from:

  21. Adler-Milstein J, Holmgren AJ, Kralovec P, Worzala C, Searcy T, Patel V. Electronic health record adoption in US hospitals: the emergence of a digital advanced use divide. J Am Med Inform Assoc. 2017;24:1142–8.

  22. Public Health Service Act, 42. Title XXVI HIV Health Care Services Program U.S.C. §§ 2601–2695. (2015).

  23. Creger T, Burgan K, Turner WH, Tarrant A, Parmar J, Rana A, et al. Using implementation mapping to ensure the success of PrEP optimization through enhanced Continuum Tracking (PrOTECT) AL-A Structural intervention to Track the Statewide PrEP Care Continuum in Alabama. J Acquir Immune Defic Syndr. 2022;90:161–S6.

  24. QGIS Association; 2023. Available from:

  25. HIV Pre-exposure Prophylaxis (PrEP) Care System. National Center for HIV, Viral Hepatitis, STD, and TB Prevention.; 2023; Available from:

  26. Weiner BJ, Lewis CC, Stanick C, Powell BJ, Dorsey CN, Clary AS, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. 2017;12:108.

  27. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38:65–76.

  28. Brooke J. SUS: a quick and dirty usability scale. Usability Eval Ind. 1995; 189.

  29. Sekhon M, Cartwright M, Francis JJ. Acceptability of healthcare interventions: an overview of reviews and development of a theoretical framework. BMC Health Serv Res. 2017;17:88.

  30. Grant R. Data visualization: charts, maps, and interactive graphics. Boca Raton: CRC Press, Taylor & Francis Group; 2019.

  31. Sosulski K. Data visualization made simple: insights into becoming visual. New York: Routledge; 2019.

  32. Marshall BDL, Yedinak JL, Goyer J, Green TC, Koziol JA, Alexander-Scott N. Development of a Statewide, publicly accessible Drug Overdose surveillance and information system. Am J Public Health. 2017;107:1760–3.

  33. Waye KM, Yedinak JL, Koziol J, Marshall BDL. Action-focused, plain language communication for Overdose prevention: a qualitative analysis of Rhode Island’s Overdose surveillance and information dashboard. Int J Drug Policy. 2018;62:86–93.

  34. Ghazisaeidi M, Safdari R, Torabi M, Mirzaee M, Farzi J, Goodini A. Development of performance dashboards in Healthcare Sector: key practical issues. Acta Inf Med. 2015;23:317–21.

  35. van Deen WK, Cho ES, Pustolski K, Wixon D, Lamb S, Valente TW, et al. Involving end-users in the design of an audit and feedback intervention in the emergency department setting - a mixed methods study. BMC Health Serv Res. 2019;19:270.

  36. Arikan F, Kara H, Erdogan E, Ulker F. Barriers to Adoption of Electronic Health Record Systems from the perspective of nurses: a cross-sectional study. Comput Inf Nurs. 2021;40:236–43.

  37. Kruse CS, Kristof C, Jones B, Mitchell E, Martinez A. Barriers to Electronic Health Record Adoption: a systematic literature review. J Med Syst. 2016;40:252.

  38. Assefa Y, Gilks CF. Ending the epidemic of HIV/AIDS by 2030: will there be an endgame to HIV, or an endemic HIV requiring an integrated health systems response in many countries? Int J Infect Dis. 2020;100:273–7.


The authors would like to thank all of the clinicians, data managers, social workers, staff, and patients of Alabama Quality Management Group-associated clinics.


This research was funded by the National Institute of Allergy and Infectious Diseases (P30 AI027767).

Author information

Authors and Affiliations



LE oversaw manuscript concept development, interpretation of results, and drafting manuscript. KB conducted surveys and interviews, analyzed results, and drafted manuscript. GM assisted with analyzing results, drafting manuscript, editing, and formatting the manuscript. AG and BP assisted in study conduct, interpretations of SUS results, and design of data dashboard. SH, KK, ST, TB, and KT contributed to development of data dashboard and revisions to manuscript. MM and AR assisted with study concept development and revisions to manuscript. All authors have read and approved this manuscript.

Corresponding author

Correspondence to Latesha Elopre.

Ethics declarations

Ethics approval and consent to participate

This study’s protocols, documents, and forms were approved by the University of Alabama at Birmingham (UAB) Institutional Review Board (IRB). The approved IRB protocol number for the present study is IRB-300004157. Due to the ongoing COVID-19 pandemic, all interviews were conducted virtually; therefore, the study team received IRB approval to waive documentation of informed consent. Written informed consent was deemed unsafe due to the COVID-19 pandemic. Participants were given an information sheet during recruitment to review and ask questions prior to sharing the survey or scheduling the interview. During the interview, informed consent was obtained verbally and recorded as part of the transcript.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1:

The attached supplementary materials include a copy of the Data Dictionary provided to sites for clinical uploads to the dashboard, a copy of the qualitative interview guide, and a copy of the codebook developed during analysis of qualitative interviews

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article

Burgan, K., McCollum, C.G., Guzman, A. et al. A mixed methods evaluation assessing the feasibility of implementing a PrEP data dashboard in the Southeastern United States. BMC Health Serv Res 24, 101 (2024).
