What’s the Eberly Center reading and thinking about this month?
The Research and Scholarship Digest, published the first Monday of each month, consists of short summaries of recent peer-reviewed studies on teaching and learning. This digest offers a view into what we are reading and thinking about at the Eberly Center: research that
• adds to our understanding of how students learn
• is potentially generalizable across teaching contexts in higher education
• provokes reflection on implications for our teaching and educational development practices.
We hope readers of this digest will find it a useful resource for staying in tune with the rapidly expanding education research literature.
February 2026
Baldock, B. L., & White, G. W. (2026). Group Work Anxiety and STEM Identity: Exploring the Role of Gender, Mental Health, and Neurodevelopmental Differences. Journal of Chemical Education.
This article investigates whether undergraduate STEM majors with “marginalized identities” (related to gender, racial/ethnic identity, neurodivergence, or mental health diagnoses) at a midsized private college are more likely to experience group work anxiety, or “psychosocial stress” related to group work, that might affect their STEM identity or performance in STEM courses with significant collaborative active learning components. All declared STEM majors (n=591) were invited to participate in a survey assessing their attitudes toward and preferences in group work. 155 students completed the survey; 49 of those were contacted the following semester for a follow-up survey. The researchers found significant relationships between group work anxiety and lower STEM identity for female students and for students who reported a mental health diagnosis; lower STEM identity was in turn related to lower academic performance. They suggest that chemical educators carefully consider the design and implementation of active learning involving group work and take steps to mitigate students’ stress and anxiety. The article concludes with suggestions for reducing the potential negative effects of group work anxiety: leveraging Universal Design for Learning principles, considering group composition, building rapport among teammates, clarifying group roles, accommodating neurodiversity, and incorporating equitable grading practices.
Chouvalova, A., Billings, I., Caraway, A. G., Hoggatt, N., Kim, G. Y., Mehta, A., ... & Limeri, L. B. (2026). Characterizing the content and mechanisms of instructor messages that communicate instructor beliefs about ability to undergraduates. CBE—Life Sciences Education, 25(1), ar6.
Some research has shown that instructors’ implicit beliefs, such as growth mindset, can affect students and can be communicated in numerous ways. This work investigates three implicit instructor beliefs about students’ abilities: mindset (i.e., whether intelligence can be grown), universality (i.e., how widely the potential for high ability is distributed), and brilliance (i.e., the extent to which innate talent is required for success). Twenty-four students from a diverse sample of four universities (e.g., a historically Black university, a university with high enrollment of first-generation students) participated in semi-structured interviews in which they were asked to interpret a series of survey items measuring what students perceive about instructor beliefs. Students were asked to consider a particular instructor, rate each item on a 1–5 agree/disagree scale, and elaborate on why they selected each rating, with further probing questions from the interviewer where needed. Results showed four main themes in the content of instructor messages that communicated these beliefs: 1) affordances for success, 2) goal orientation, 3) distribution of achievement, and 4) attributions for performance. They also identified three mechanisms through which those messages are communicated: 1) statements, 2) actions, and 3) course structure and policies. All themes and all mechanisms were associated with all three types of implicit beliefs, although there were trends and variations that the authors discuss in detail, along with implications for instructional practices that may be beneficial.
Hou, C., Zhu, G., Liu, Y., Sudarshan, V., Chong, J. L. L., Zhang, F. Y., ... & Ong, Y. S. (2026). The Effects of Critical Thinking Intervention on Reliance Behaviors, Problem-solving Quality, and Creativity during Human-Generative AI Collaborative Learning. Computers & Education, 105576.
This study investigated whether a critical thinking intervention could improve students’ reliance behaviors, problem-solving quality, and creativity during human–Generative AI collaborative problem-based learning. The experiment involved 226 undergraduate students; the intervention integrated authentic instruction, structured dialogue through role-play, and AI-mediated Socratic peer mentoring, while the comparison group used Generative AI freely. Results showed no significant improvement in students’ self-reported critical thinking, likely due to the short duration of the intervention. However, the intervention significantly reduced students’ thoughtless reliance on AI and direct adoption of AI-generated content. At the group level, students in the intervention condition produced more creative solutions, demonstrating higher originality and idea density, while problem-solving quality showed mixed effects across conditions. Overall, the findings suggest that instructional design plays a critical role in shaping how students use Generative AI, and that fostering critical engagement, rather than restricting AI use, is the more realistic approach in today’s higher education.
Morgan, N., & Jones, M. A. (2025). Student Perceptions on the Use of Journal Clubs for Enhancing Academic Literacy. Active Learning in Higher Education, 14697874251388982.
This pedagogical study explores how journal clubs, a traditionally small-group, research-focused practice, can be redesigned as a scalable, curriculum-embedded active learning strategy to support academic literacy and graduate capital development in large undergraduate cohorts. Conducted at the University of Salford, the research responds to a common transition challenge in higher education: many students enter university underprepared for academic reading and writing, while lecture-based approaches often struggle to engage them effectively.
To scale journal clubs for cohorts of up to 150 students, the researchers transitioned the model from an intimate research setting to a highly structured, technology-supported learning design. Sessions were delivered in a large flat laboratory space rather than tiered lecture halls, enabling many small-group discussions to occur simultaneously. The cohort was subdivided into groups of 6–10 students to preserve psychological safety and peer-to-peer learning, supported by layered facilitation from trained student facilitators, academic staff, and demonstrators rotating between groups.
Digital interactivity was central to scaling. Using Mentimeter, each group submitted discussion summaries that were displayed live across the room, allowing ideas to be shared and synthesized at the cohort level. A tightly timed two-hour roadmap, combining an initial reading period, structured discussion blocks, and whole-cohort feedback, ensured consistency and momentum across groups. Engagement was further strengthened through constructive alignment with an upcoming assessed laboratory report, reinforcing relevance in a large-class context.
The study suggests that journal clubs can operate as a scalable, inclusive alternative to didactic instruction, supporting student confidence, sense of belonging, and academic literacy. The model is also adaptable to online or hybrid delivery using breakout rooms, making it a transferable teaching pattern for institutions seeking active learning at scale.
Oppenheimer, D. M., & Ellefson, M. R. (2024). How to interpret discrepancies in empirical results from educational intervention studies. Scholarship of Teaching and Learning in Psychology.
The Teaching as Research (TAR) landscape often features studies with seemingly conflicting or contradictory results, which creates a dilemma for instructors who seek to implement evidence-based interventions in their classes. A classic example is a study with a remarkable or impactful finding that is then followed by other studies that find different results. The authors describe a framework of five principles used in intervention science for evaluating the efficacy and effectiveness of studies, and discuss the important differences between a direct replication and a generalization study and how those differences can be interpreted when facing the aforementioned instructional dilemma. The authors also emphasize teaching context as an important variable for instructors deciding whether certain research findings are a good fit for their classroom.
