McClanahan Kronborg (carbonchalk28)

Communicating uncertainty is an art that requires practice. The purpose of this study was to compare pedagogies for teaching pharmacy students to communicate definitive uncertainty. A case scenario, featuring a busy physician asking a question without a definitive answer, was presented to pharmacy students using two pedagogies: (1) an in-person standardized client and (2) a virtual written case. After completing the case, students self-assessed their confidence in communicating uncertainty via a survey containing both rating-scale and open-ended questions. Within-group differences in self-confidence were compared using Wilcoxon signed-rank tests, and between-group differences were compared using Mann-Whitney U tests. Responses to open-ended questions were analyzed descriptively for themes using qualitative assessment methods. Both the in-person standardized client (70 to 81, P ≤ .001) and the virtual written case (74 to 85, P ≤ .001) significantly increased students' self-rated confidence to verbalize "I don't know" to a healthcare provider. No significant differences were observed between the pedagogies; however, students who completed the virtual written case mentioned a desire for "additional practice opportunities" more frequently than students who completed the in-person standardized client. Both the in-person standardized client and the virtual written case are effective methods for increasing pharmacy students' comfort with communicating definitive uncertainty. Further research is needed on teaching uncertainty communication to pharmacists.

It is unclear how clinical reasoning is affected by a single advanced pharmacy practice experience (APPE) or how preceptors can further develop these skills.
Students completing an APPE at one of four sites were invited to participate. To assess clinical reasoning skills, students completed a 30-item script concordance test (SCT) during week 1 and week 5 of a rotation. Students were divided into control and intervention groups; the intervention group participated in a clinical reasoning discussion in which students presented a case and led a discussion on how to reason through treatment options. The changes in mean SCT score between week 1 and week 5 were 0.84 (2.8%) in the control group (n = 15) and 1.23 (4.1%) in the intervention group (n = 28). The change was not significant in the control group (P = .07; CI, -0.34 to 2.01) but was statistically significant in the intervention group (P = .02; CI, 0.23 to 2.23). An independent-samples t-test comparing the SCT score changes of the control and intervention groups showed no significant difference (P = .62; CI, -1.18 to 1.96). This study demonstrated the feasibility of implementing an SCT in experiential education. SCT scores did not improve significantly beyond the standard APPE in response to the focused educational intervention, but investigators found that the discussion facilitated rich conversations about patient cases and was valuable for assessing students' thinking patterns.

To describe the design of an application-based calculations review module, to determine retention of skills learned in a calculations course, and to determine students' ability to identify the information necessary to solve calculations.
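The statistical comparisons described in the abstracts above (Wilcoxon signed-rank for paired within-group ratings, Mann-Whitney U for independent groups, and paired plus independent-samples t-tests for SCT score changes) can be sketched with `scipy.stats`. This is a minimal illustration only: all data below are randomly generated placeholders, not the studies' data, and the group sizes merely echo those reported.

```python
# Sketch of the statistical tests named in the abstracts, using scipy.stats.
# All data are invented for illustration; they are NOT the studies' data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Within-group pre/post confidence ratings (paired, ordinal):
# Wilcoxon signed-rank test.
pre = rng.integers(50, 90, size=30)
post = pre + rng.integers(0, 15, size=30)  # systematically higher post scores
w_stat, w_p = stats.wilcoxon(pre, post)

# Between-group comparison of two independent cohorts (ordinal):
# Mann-Whitney U test.
group_a = rng.integers(60, 95, size=30)
group_b = rng.integers(60, 95, size=28)
u_stat, u_p = stats.mannwhitneyu(group_a, group_b)

# Week 1 vs week 5 SCT scores within one group: paired t-test.
week1 = rng.normal(20.0, 3.0, size=28)
week5 = week1 + rng.normal(1.2, 2.0, size=28)
t_paired, p_paired = stats.ttest_rel(week1, week5)

# Control vs intervention change scores: independent-samples t-test.
change_control = rng.normal(0.8, 2.5, size=15)
change_intervention = rng.normal(1.2, 2.5, size=28)
t_ind, p_ind = stats.ttest_ind(change_control, change_intervention)

print(f"Wilcoxon p={w_p:.3g}, Mann-Whitney p={u_p:.3g}, "
      f"paired t p={p_paired:.3g}, independent t p={p_ind:.3g}")
```

The nonparametric tests (Wilcoxon, Mann-Whitney) suit ordinal survey ratings, where normality cannot be assumed; the t-tests match the SCT analysis, which treats score changes as approximately continuous.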