Understanding Pharmacists' Feedback Comments from the Perspective of a Health Information Provider

David Li Tang, France Bouthillier
School of Information Studies, McGill University, Montréal, Québec, Canada

Pierre Pluye, Roland Grad
Department of Family Medicine, McGill University, Montréal, Québec, Canada

Carol Repchinsky
Canadian Pharmacists Association, Ottawa, Canada

Abstract: Pharmacists represent a unique group of health information users. This study examines their feedback about a health information resource, and explores the potential usefulness of that feedback from the information provider's perspective. It is part of a doctoral dissertation on using health professionals' feedback for optimizing health information resources.

Résumé : Les pharmaciens représentent un groupe d'utilisateurs unique des informations de santé. Cette étude examine leurs commentaires sur une ressource électronique d'information de santé, et explore l'utilité potentielle de ces commentaires du point de vue du fournisseur d'information (rétroaction). Elle fait partie d'une thèse de doctorat sur l'apport des commentaires des professionnels de santé pour améliorer les ressources d'information.

1. Introduction

This study is part of an ongoing doctoral project on using health professionals' feedback to optimize health information resources. The project examines written feedback comments submitted by pharmacists about content in e-Therapeutics+®, an online therapeutic information resource published by the Canadian Pharmacists Association (CPhA). Our project is based on a participatory research approach in collaboration with the CPhA, and on the following assumption: pharmacists' feedback comments are useful to the CPhA and should be managed systematically. Before expending effort on developing a formal approach to feedback management, it is important to first (a) gain an in-depth understanding of the feedback comments to be dealt with, and (b) verify the assumption that the comments are useful to the CPhA. This exploratory study addresses two research questions: (1) What types of issues are reported in pharmacists' feedback comments? (2) In what ways might pharmacists' feedback comments be useful to the CPhA? Question 1 aims to create a typology of feedback comments, and question 2 will reveal the areas in which the CPhA, the information provider, may benefit from these comments. Our starting point for answering these two questions is practical rather than theoretical, i.e., based on the CPhA's actual experience in the cognitive and social situation under investigation (Murphy, 1990). We expect to reach a "pragmatical belief" that can provide grounds for subsequent research and action (Kant in Scott, 2009).


2. Background

Current technology allows concomitant collection of user feedback when an information resource is consulted. For their continuing education, the Canadian Pharmacists Association sends its members highlights from e-Therapeutics+® via weekly emails. Each highlight message contains a link to a web form through which users can provide feedback after reading the highlight. The feedback considered in this study comprises verbatim comments collected via this web-based feedback form, which are rich in meaning. As the primary interest of the CPhA is to optimize the content quality of e-Therapeutics+®, screening criteria have been applied to identify the comments that have constructive value (e.g., comments revealing that the reader was not "convinced" by the information). In response to selected constructive feedback comments, the information provider has made changes to the content of e-Therapeutics+®. This opens a window for identifying the usefulness of those feedback comments.
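Purely as an illustration, and not as a description of the CPhA's actual system, the following minimal sketch shows how a collected feedback record and a simple screening step for constructive comments could be represented; the class, field names, and keyword cues are hypothetical assumptions, whereas the actual screening was criteria-based and performed by people.

```python
from dataclasses import dataclass

# Hypothetical record for one feedback submission on one highlight excerpt.
@dataclass
class FeedbackComment:
    highlight_id: str   # identifier of the e-Therapeutics+ highlight excerpt
    pharmacist_id: str  # anonymized identifier of the feedback provider
    text: str           # verbatim comment entered in the web form

# Illustrative screening cues only: the real criteria were qualitative judgments
# about constructive value (e.g., the reader was not "convinced").
CONSTRUCTIVE_CUES = ("not clear", "disagree", "missing", "why", "evidence", "?")

def is_constructive(comment: FeedbackComment) -> bool:
    text = comment.text.lower()
    return any(cue in text for cue in CONSTRUCTIVE_CUES)

comments = [
    FeedbackComment("H123", "P001", "Not clear re glucosamine or chondroitin or both."),
    FeedbackComment("H124", "P002", "Thanks, very useful."),
]
constructive = [c for c in comments if is_constructive(c)]
print(len(constructive))  # -> 1
```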

3. Methodology

Our study follows a generic qualitative research methodology (Caelli, Ray, and Mill 2003) and comprises (a) a secondary analysis of existing feedback data to answer the first question, and (b) interviews with CPhA staff to answer the second question. The qualitative data analysis employs an inductive thematic analysis procedure (Braun and Clarke 2006).

3.1 Secondary analysis of feedback data

Reusing previously collected data originally gathered by others is considered secondary analysis, and its focus is on analyzing rather than collecting data (Trochim and Donnelly 2007).

The Data

A total of 313 constructive feedback comments were collected in a previous study on the impact of information on therapeutics read by pharmacists (Pluye et al. 2009). The unit of analysis is one short comment (e.g., "Not clear re glucosamine or chondroitin or both.") on one highlight excerpt from the e-Therapeutics+® database.

Data Analysis

Latent coding is performed by reading the entire comment in order to interpret the issue (e.g., a problem) perceived by the feedback provider. The aim is to infer the underlying, implicit meaning of the feedback message (e.g., lack of quality in the knowledge conveyed through the information), as that meaning is often not explicitly expressed at the semantic level (e.g., the user questions the validity of the referenced study) (Krippendorff, 1980). Inferred meanings are then categorized to construct a typology (i.e., problematic themes) for question 1.

3.2 Interview and thematic analysis

Six CPhA editorial staff were interviewed: the Editor-in-Chief and five clinical editors, each responsible for a topical area such as "Cardiovascular". The interviews focused on the feedback handling process, which is initiated by the Editor-in-Chief, who reviews all constructive feedback comments and forwards those requiring investigation to the respective clinical editors.

Data Collection

Two types of questions were asked. First, semi-structured questions guided discussions about what happened after a constructive feedback comment was received. Each participant discussed 4-12 comments that he or she had processed. Then, open-ended questions were asked to explore the participant's perception of the usefulness of feedback comments. The interviews were audio recorded and transcribed for data analysis.

Data Analysis

The structure of the interview questions resulted in two types of data for analysis. First, accounts of what happened after a comment was received were analyzed to discover promising patterns of feedback utilization, by identifying themes of (a) what was done and, more importantly, (b) what beneficial outcomes, if any, resulted from those undertakings. Second, an initial look at the answers about the perceived usefulness of feedback comments showed that the editors' reflective thoughts are very succinct and can therefore be reported as-is to directly answer question 2, without the need for (redundant) interpretation.

4. Results and discussion

The data analyses are still in progress, and some initial findings are presented in this paper as follows.

4.1 Types of reported problems

From analyzing the existing constructive feedback comments, nine categories emerged that represent types of reported issues with respect to the information resource:

• Missing details
• Missing relevant information
• Disagreement (different personal opinion stated)
• Suggestion of other evidence or issues
• Difficulty in understanding
• Ineffective content presentation (e.g., verbose or unclear wording)
• Impracticability or irrelevance regarding practice
• Lack of quality or value in the conveyed knowledge
• Usability problems with features of the information resource

These categories are not mutually exclusive. For example, a suggestion of other evidence and a disagreement could co-occur; in that case, the comment is more informative to the editor than a mere statement of disagreement in opinion. These categories appear to refine and expand prior research findings about problems with health information use (Grad et al. 2007; Pluye et al. 2009; Pluye et al. 2005), although the prior research used different methods to study different information users in a pull context (i.e., clinicians searching for information), as opposed to the push context of the present study (i.e., pharmacists receiving information without looking for it). For the CPhA, knowing exactly what types of problems are reported is the first step in addressing these problems.
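As a hypothetical illustration only (not part of the study's coding procedure), the sketch below shows one way the non-mutually-exclusive typology could be represented so that a single comment carries a set of issue categories rather than a single label; the identifiers and example assignments are assumptions.

```python
from enum import Enum, auto

# The nine issue categories that emerged from the thematic analysis.
class IssueType(Enum):
    MISSING_DETAILS = auto()
    MISSING_RELEVANT_INFORMATION = auto()
    DISAGREEMENT = auto()
    SUGGESTION_OF_OTHER_EVIDENCE = auto()
    DIFFICULTY_IN_UNDERSTANDING = auto()
    INEFFECTIVE_PRESENTATION = auto()
    IMPRACTICABILITY_OR_IRRELEVANCE = auto()
    LACK_OF_QUALITY_OR_VALUE = auto()
    USABILITY_PROBLEM = auto()

# Because the categories are not mutually exclusive, each coded comment maps to
# a *set* of categories (hypothetical comment identifiers and assignments).
coded_comments: dict[str, set[IssueType]] = {
    "comment-042": {IssueType.DISAGREEMENT, IssueType.SUGGESTION_OF_OTHER_EVIDENCE},
    "comment-043": {IssueType.DIFFICULTY_IN_UNDERSTANDING},
}

# Simple frequency count per category across all coded comments.
counts = {issue: 0 for issue in IssueType}
for issues in coded_comments.values():
    for issue in issues:
        counts[issue] += 1
print(counts[IssueType.DISAGREEMENT])  # -> 1
```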


4.2 Use and perceived usefulness of feedback comments

The inductive thematic analysis of a total of 40 units revealed an essential three-step procedure: read/interpret–investigate/communicate–change/no change (to e-Therapeutics+®). Some promising patterns and outcomes of feedback utilization were found pertaining to each of the three steps:

(1) The editors are often prompted by a feedback comment to critically re-examine (a) the way information is presented (e.g., the clarity of wording), as well as (b) the validity of the knowledge conveyed in the text. In the latter case, an investigation ensues.

(2) Then, either the validity of the knowledge is confirmed, or its inadequacy is identified. When the validity of the knowledge appears questionable, editors sometimes bring the reported issue to the attention of the original chapter author of e-Therapeutics+® so that it can be addressed in the next version.

(3) Once the investigation is complete, corrections can be reflected in the online version of the information resource within two weeks.

The perceived usefulness of the feedback comments can be summarized as follows:

• From the end users' perspective, the feedback tells the editors what information is important (i.e., what the end users most need to know).
• Although all published content is peer reviewed, the feedback comments allow the information provider to learn the end users' judgment of the content.
• The feedback channel provides an ongoing check for potential errors and wording issues. Although not always significant, these refinements to content are uniquely valuable because they occur through a continuous process that might not happen with planned major revisions, which typically take place at two-year intervals.
• Knowledgeable users may provide content that editors did not consider or were not updating quickly enough. Thus, the knowledge of end users contributes to extending or updating the knowledge of editors.
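To illustrate the three-step handling procedure identified above (hypothetically, not as a description of the CPhA's actual tooling), the sketch below traces a single comment through the steps and records its outcome; all names and the example case are assumptions.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

# The three steps of the observed feedback-handling procedure.
class Step(Enum):
    READ_INTERPRET = auto()
    INVESTIGATE_COMMUNICATE = auto()
    CHANGE_OR_NO_CHANGE = auto()

class Outcome(Enum):
    CONTENT_CHANGED = auto()      # correction published online (within ~2 weeks)
    FORWARDED_TO_AUTHOR = auto()  # issue flagged for the next version of the chapter
    NO_CHANGE = auto()            # validity of the knowledge confirmed

@dataclass
class FeedbackCase:
    comment_id: str
    steps_completed: list[Step] = field(default_factory=list)
    outcome: Outcome | None = None

    def advance(self, step: Step) -> None:
        # Record the step; in practice the handling is done by the Editor-in-Chief
        # and the clinical editors, not by software.
        self.steps_completed.append(step)

case = FeedbackCase("comment-042")
case.advance(Step.READ_INTERPRET)
case.advance(Step.INVESTIGATE_COMMUNICATE)
case.advance(Step.CHANGE_OR_NO_CHANGE)
case.outcome = Outcome.CONTENT_CHANGED
print(case.outcome.name)  # -> CONTENT_CHANGED
```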

5. Conclusion

Findings regarding the two research questions can be summarized as follows: (1) The problem typology derived from historical feedback data strongly suggests that addressing what is reported in pharmacists' feedback might result in optimized information resource quality. (2) The CPhA's current practices and the editors' perceptions indicate that substantial benefits can be attained through effective use of feedback comments. Therefore, the general assumption of the overall research project is positively confirmed: it is worthwhile to spend effort on developing a systematic process for the CPhA to facilitate the use of feedback comments. Moreover, the stakeholders' current practices, perceptions, and expectations regarding user feedback have implications to be considered in subsequent stages of this research project.


References

Braun, V., and V. Clarke. "Using Thematic Analysis in Psychology." Qualitative Research in Psychology 3, no. 2 (2006): 77-101.

Caelli, K., L. Ray, and J. Mill. "'Clear as Mud': Toward Greater Clarity in Generic Qualitative Research." The International Journal of Qualitative Methods 2, no. 2 (2003).

Grad, R. M., M. Shulha, P. Pluye, J. Johnson-Lafleur, M.-E. Beauchamp, A. Macaulay, J. Hanley, K. Dalkir, and B. Marlow. "Validation of a Method to Assess the Clinical Impact of Electronic Knowledge Resources." E-Service Journal 5, no. 2 (2007): 113.

Krippendorff, K. Content Analysis: An Introduction to Its Methodology. Beverly Hills: Sage Publications, 1980.

Murphy, J. P. Pragmatism: From Peirce to Davidson. Boulder: Westview Press, 1990.

Pluye, P., R. M. Grad, C. Repchinsky, B. Farrell, J. Johnson-Lafleur, T. Bambrick, M. Dawes, et al. "IAM: A Comprehensive and Systematic Information Assessment Method for Electronic Knowledge Resources." Edited by A. Dwivedi. Hershey: IGI Publishing, 2009.

Pluye, P., R. M. Grad, R. Stephenson, and L. G. Dunikowski. "A New Impact Assessment Method to Evaluate Knowledge Resources." AMIA Symposium Proceedings (2005): 609-13.

Scott, P. J., and J. S. Briggs. "A Pragmatist Argument for Mixed Methodology in Medical Informatics." Journal of Mixed Methods Research 3, no. 3 (2009): 223-41.

Trochim, W. M. K., and J. P. Donnelly. Research Methods Knowledge Base. Mason, OH: Thomson Custom Publishing, 2007.
