Student Reasoning in Organic Chemistry
 9781839164910, 9781839167782, 9781839167799

Table of contents:
Front Cover
Student Reasoning in Organic Chemistry
Dedication
Foreword
Preface
Contents
SECTION A
Chapter 1 - Students’ Attention on Curved Arrows While Evaluating the Plausibility of an Organic Mechanistic Step
1.1 Introduction
1.2 Theoretical Framework
1.2.1 Abstractness
1.2.2 Student Reasoning
1.2.3 Eye Tracking
1.3 Research Questions
1.4 Methods
1.4.1 Context and Participants
1.4.2 Data Collection
1.4.3 Data Analysis
1.5 Results and Discussion
1.5.1 Explicit and Implicit Features
1.5.2 Specific and General Terminology
1.5.3 Reasoning Based on Sequence vs. Chaining
1.5.4 AOIs
1.5.5 Success
1.6 Conclusions, Implications, and Limitations
Acknowledgements
References
Chapter 2 - Supporting Spatial Thinking in Organic Chemistry Through Augmented Reality—An Explorative Interview Study
2.1 Introduction
2.1.1 Multiple External Representations in Organic Chemistry Learning
2.1.2 Spatial Reasoning in Organic Chemistry
2.2 Augmented Reality as an Instructional Aid in Organic Chemistry
2.3 Aim of the Study
2.4 Sample and Design
2.5 Results
2.5.1 Task 1—Translation Between a Dash-wedge Notation and a Newman Projection
2.5.2 Task 2—Generating a Newman Projection from a Given Dash-wedge Notation
2.5.3 Task 3—Translating Between Two Ball-and-stick Models
2.5.4 Task 4—Determine the Product Conformation
2.6 Discussion
References
Chapter 3 - Representational Competence Under the Magnifying Glass—The Interplay Between Student Reasoning Skills, Conceptual Understanding, and the Nature of Representations†
3.1 Introduction
3.1.1 The Role of Representational Competence in Organic Chemistry
3.1.2 The Interplay Between the Nature of Representations, Conceptual Understanding, and Reasoning
3.2 Study Design and Methods
3.3 Findings
3.3.1 Students' Reasoning While Interpreting Dash-wedge Diagrams and Newman Projections
3.3.1.1 Description of the Interpretation Tasks
3.3.1.2 Interpreting Dash-wedge Diagrams
3.3.1.3 Interpreting Newman Projections
3.3.2 Students' Reasoning While Translating Between Dash-wedge Diagrams and Newman Projections
3.3.2.1 Description of the Translation Tasks
3.3.2.2 Translating from Dash-wedge Diagrams to Newman Projections
3.3.2.3 Translating from Newman Projections to Dash-wedge Diagrams
3.3.3 Students' Reasoning While Generating a Newman Projection from a Dash-wedge Diagram
3.3.3.1 Description of the Generation Task
3.3.3.2 Generating a Newman Projection from a Dash-wedge Diagram
3.3.4 Students' Reasoning While Using Newman Projections to Make Inferences About Stability
3.3.4.1 Description of the Use Tasks
3.3.4.2 Using Newman Projections to Make Inferences About Stability
3.4 Summary of Findings and Conclusions
3.4.1 Summary of Findings Across the Tasks that Focused on Various Representational Competence Skills
3.4.2 Summary of Findings for Each Representative Student
3.4.3 Conclusions
3.5 Implications
3.5.1 Implications for Instruction
3.5.2 Implications for Research
Acknowledgements
References
SECTION B
Chapter 4 - Fostering Causal Mechanistic Reasoning as a Means of Modelling in Organic Chemistry
4.1 Introduction
4.2 Causal Mechanistic Reasoning Underpins Expert-like Modeling
4.3 Characterizing Causal Mechanistic Reasoning Across Different Reactions
4.4 Eliciting Causal Mechanistic Reasoning—Attention to Scaffolding
4.5 Causal Mechanistic Reasoning in Organic Chemistry
4.6 Characterizing the Relationship Between Reasoning and Arrow Drawings
4.7 Summary
4.8 Strategies for Fostering Causal Mechanistic Reasoning in Learning Environments
Acknowledgements
References
Chapter 5 - Students’ Reasoning in Chemistry Arguments and Designing Resources Using Constructive Alignment
5.1 Introduction
5.1.1 Citizens Need to be Able to Reason with Scientific Evidence
5.2 Framework—Reasoning, Granularity, and Comparisons
5.2.1 Modes of Reasoning
5.2.2 Levels of Granularity—Moving Between Grain Sizes
5.2.3 Comparison—Considering Alternatives
5.3 Students’ Arguments Can Vary Between Tasks
5.4 Supporting Student Learning Through Constructive Alignment
5.4.1 Instructional Design
5.4.2 Scaffolding Skill Development
5.4.3 Resources for Constructively Aligning Reasoning into a Course
5.5 Conclusions
References
Chapter 6 - From Free Association to Goal-directed Problem-solving—Network Analysis of Students' Use of Chemical Concepts in Mechanistic Reasoning†
6.1 Introduction
6.2 Theoretical Background
6.2.1 Reasons for Students’ Difficulties with Mechanistic Reasoning
6.2.2 Organization of Knowledge Structure Through Cognitive Networks
6.3 Research Questions
6.4 Method
6.4.1 Cohort
6.4.2 Case Comparison Tasks
6.4.3 Data Collection and Analysis
6.5 Results
6.6 Discussion and Conclusions
6.6.1 Implications for Teaching
Acknowledgements
References
Chapter 7 - Epistemic Stances in Action—Students’ Reasoning Process While Reflecting About Alternative Reaction Pathways in Organic Chemistry
7.1 Introduction
7.1.1 Reasoning in Students’ Argumentation
7.1.2 Toward an Understanding of Epistemic Stances
7.2 Research Questions
7.3 Study Design and Methods
7.3.1 Data Analysis
7.4 Results and Discussion
7.4.1 Case 1—Taylor
7.4.2 Case 2—Robin
7.5 Conclusion and Implications
Acknowledgements
References
Chapter 8 - How Do Students Reason When They Have to Describe the “What” and “Why” of a Given Reaction Mechanism?†
8.1 Introduction
8.2 Theoretical Background—Mechanistic Reasoning and Writing-to-learn in Organic Chemistry
8.3 Research Questions
8.4 Methods
8.4.1 The Course “Training OC”
8.4.2 Sample
8.4.3 The Coding Process
8.5 Results and Discussion
8.5.1 RQ1: What is the Quality of Students’ Reasoning Regarding Their Description of the “What” of the Given Reaction Mechanism
8.5.1.1 Properties of Entities
8.5.1.2 Activities of Entities
8.5.2 RQ2: What is the Quality of Students’ Reasoning Regarding Their Description of the “Why” of the Given Reaction Mechanism
8.5.2.1 Charges
8.5.2.2 Bonding
8.5.2.3 Brønsted
8.5.2.4 Nucleophile–Electrophile
8.6 Limitations
8.7 Implications
Acknowledgements
References
Chapter 9 - In-the-moment Learning of Organic Chemistry During Interactive Lectures Through the Lens of Practical Epistemology Analysis
9.1 Introduction
9.1.1 Practical Epistemology Analysis (PEA)
9.2 Methodology
9.2.1 Study Context
9.2.2 Data Collection
9.2.3 Data Analysis
9.3 Results and Discussion
9.3.1 What Drives Student In-the-moment Learning—Gap Patterns
9.3.1.1 Pattern 1
9.3.1.2 Pattern 2
9.3.2 How Students Learn In-the-moment of Group Discussions—Relation Patterns
9.4 Conclusions and Implications
Acknowledgements
References
SECTION C
Chapter 10 - Flipped Classrooms in Organic Chemistry—A Closer Look at Student Reasoning Through Discourse Analysis of a Group Activity
10.1 Introduction
10.1.1 Pre-class Activity—Videos
10.1.2 Pre-class Activity—Quizzes
10.1.3 In-class Activity—Student Response Systems
10.1.4 In-class Activity—Group Work
10.2 Student Dialogue in a Flipped Course—A Case Study
10.2.1 The ICAP Framework
10.2.2 Argumentation and Student Reasoning in Organic Chemistry
10.2.3 Course Context and Participants
10.2.4 Group Quiz Format
10.2.5 Data Collection and Analysis
10.3 Findings
10.3.1 Group A Summary
10.3.2 Quiz 2, Prompt 5—Group B
10.3.3 ICAP Analysis—Comparison of Group A to Group B
10.3.4 Argumentation—Comparison of Group A to Group B
10.4 Conclusions and Implications
10.4.1 Scaffolding Questions to Promote Argumentation
10.4.2 Group Composition and Roles
10.4.3 Incorporating Student Observations in Assessment of Group Activities
Acknowledgements
References
Chapter 11 - Systemic Assessment Questions as a Means of Assessment in Organic Chemistry
11.1 Introduction
11.2 The Role of Scientific Reasoning Skills in Developing Meaningful Understanding in Organic Chemistry
11.3 Assessment of Students’ Meaningful Understanding in the Context of SATL
11.3.1 Systemic Diagrams and Systemic Assessment Questions
11.3.2 Assessment of SAQs
11.4 Research on Systemic Diagrams in Organic Chemistry Education
11.5 Example of an Activity to Assess Students’ Meaningful Understanding with SAQ Diagrams in Organic Chemistry Lessons
11.6 Conclusions and Implications
References
Chapter 12 - Variations in the Teaching of Resonance—An Exploration of Organic Chemistry Instructors’ Enacted Pedagogical Content Knowledge†
12.1 Introduction
12.2 Theoretical Framework
12.2.1 PCK in the Sciences
12.2.2 Coming to a Consensus on PCK
12.2.3 Tying It All Together
12.3 Methods
12.3.1 Participants
12.3.2 Data Collection
12.3.3 Data Analysis
12.4 Results
12.4.1 Grouping Instructors by ePCK
12.4.2 Integrating ePCK Components
12.4.3 Student Conceptions of the Resonance Hybrid
12.5 Discussion
12.5.1 RQ1—Characterizing Instructors’ ePCK
12.5.2 RQ2—Instructor ePCK and Student Outcomes
12.6 Limitations
12.7 Conclusions and Implications
Acknowledgements
References
Chapter 13 - Investigation of Students’ Conceptual Understanding in Organic Chemistry Through Systemic Synthesis Questions
13.1 Introduction—Conceptual Understanding in Organic Chemistry
13.2 Theoretical Foundation
13.2.1 Organic Reaction Mechanism Problems and Mechanistic Reasoning
13.2.2 Mental Models and Conceptual Models
13.2.3 Systemic Diagrams and Systemic Assessment Questions as Effective Conceptual Models
13.3 Assessing the Quality of Students’ Mental Models and/or Conceptual Structures in Organic Chemistry
13.3.1 Research Problem, Objectives and Tasks
13.3.2 Description of Scoring Scheme Applied to the Students’ Generated SSynQs and Obtained Results
13.4 Concluding Remarks and Implications for Instruction
Acknowledgements
References
Chapter 14 - Disciplining Perception—Spatial Thinking in Organic Chemistry Through Embodied Actions
14.1 Introduction
14.1.1 Perceptual Learning with Visual Representations
14.1.2 Disciplining Perception Through Embodied Actions
14.2 Present Study
14.2.1 Methods
14.2.2 Case 1—Making the Steps for Spatial Thinking Visible
14.2.2.1 Action 1. Betty Directs Attention to Spatial Information
14.2.2.2 Action 2. Betty Performs a Perceptual Stance
14.2.2.3 Action 3. Betty Physically Represents 3D Information
14.2.2.4 Summary of Case 1
14.2.3 Case 2—Performing Spatial Thinking in a Large Lecture Hall
14.2.3.1 Action 1. Mike Directs Attention to Spatial Information
14.2.3.2 Action 2. Mike Performs a Perceptual Stance
14.2.3.3 Action 3. Mike Physically Represents 3D Information
14.2.3.4 Summary of Case 2
14.2.4 Cross-case Analysis
14.3 Conclusion
Acknowledgements
References
Chapter 15 - Building Bridges Between Tasks and Flasks—Design of a Coherent Experiment-supported Learning Environment for Deep Reasoning in Organic Chemistry†
15.1 Introduction
15.2 State of Research and Approach to Design
15.2.1 Research on Student Reasoning
15.2.2 Design Objectives and Design Principles
15.2.3 Aggregation and Arrangement of Reaction Mechanisms and Concepts in a Coherent Learning Environment
15.3 Developments for Secondary and Tertiary Education
15.3.1 Secondary Education: Learning to Think in Mechanistic Alternatives—SN1 vs. E1 Reactions
15.3.2 Tertiary Education: Exploring Electronic Substituent Effects—Alkaline Hydrolysis of Substituted Ethyl Benzoates
15.4 Implications for Implementation and Teaching
15.5 Conclusion
Acknowledgements
References
SECTION D
Chapter 16 - Assessment of Assessment in Organic Chemistry—Review and Analysis of Predominant Problem Types Related to Reactions and Mechanisms
16.1 Introduction
16.1.1 Chapter Scope
16.2 Individual Reactions
16.3 Synthesis
16.3.1 Student Solutions to Traditional Synthesis Tasks
16.3.2 Non-traditional Assessment of Synthesis
16.4 Electron-pushing Mechanisms (EPMs)
16.4.1 Traditional Electron-pushing Tasks
16.4.2 Non-traditional Mechanistic Reasoning Tasks
16.5 Conclusions
Acknowledgements
References
Chapter 17 - Developing Machine Learning Models for Automated Analysis of Organic Chemistry Students’ Written Descriptions of Organic Reaction Mechanisms
17.1 Introduction
17.1.1 Eliciting Students’ Mechanistic Reasoning in Organic Chemistry Through Writing
17.1.2 Machine Learning for Analyzing Student Writing in Chemistry
17.2 Theoretical Framework
17.3 Research Questions
17.4 Methods
17.4.1 Setting and Participants
17.4.2 Writing-to-learn Assignments and Implementation
17.4.3 Data Collection
17.4.4 Data Analysis
17.4.4.1 Analytical Framework
17.4.4.2 Reliability
17.4.4.3 Development of Automated Text Analysis Models
17.5 Results and Discussion
17.5.1 RQ1—How do Students Respond to WTL Assignments Intended to Elicit How and Why Organic Reaction Mechanisms Occur
17.5.2 RQ2—Does Automated Text Analysis Allow for Predictions of the Components Included in Students’ Written Mechanistic Descriptions
17.6 Implications
17.6.1 Implications for Research
17.6.2 Implications for Practice
17.7 Limitations
17.8 Conclusions
References
Chapter 18 - Development of a Generalizable Framework for Machine Learning-based Evaluation of Written Explanations of Reaction Mechanisms from the Post-secondary Organic Chemistry Curriculum
18.1 Are Drawn Reaction Mechanisms Enough to Evaluate Understanding
18.2 Learner Understanding of Reaction Mechanisms
18.3 Assessment of Learner Understanding of Reaction Mechanisms
18.4 Training Machine Learning Models for Automated Text Analysis
18.5 Framework for Evaluating Understanding of Reaction Mechanisms
18.5.1 Levels of Explanation Sophistication
18.5.2 Evaluating Understanding of Electrophiles
18.6 Implications for Educators
18.7 Implications for Researchers
18.8 A Path toward Better Learning
Acknowledgements
References
Chapter 19 - The Central Importance of Assessing “Doing Science” to Research and Instruction†
19.1 Introduction
19.2 Assessment 101
19.2.1 Observation
19.2.2 Interpretation
19.2.3 Conceptual Change
19.2.4 How Observation, Interpretation, and Cognition Work Together
19.3 Assessing Work Aligned with the Practice of Chemistry
19.4 3D Assessments as Research Tools
19.5 3D Assessments as a Vital Part of 3D Learning Environments
19.6 Future Directions for Research on 3D Assessments
19.7 Conclusion
Acknowledgements
References
Postface
Biographies of Authors
Subject Index


Student Reasoning in Organic Chemistry

Research Advances and Evidence-based Instructional Practices

Advances in Chemistry Education Series Editor-in-chief:

Keith S. Taber, University of Cambridge, UK

Series editors:

Avi Hofstein, Weizmann Institute of Science, Israel; Vicente Talanquer, University of Arizona, USA; David Treagust, Curtin University, Australia

Editorial Advisory Board:

Mei-Hung Chiu, National Taiwan Normal University, Taiwan; Rosaria da Silva Justi, Universidade Federal de Minas Gerais, Brazil; Onno De Jong, Utrecht University, Netherlands; Ingo Eilks, University of Bremen, Germany; Murat Kahveci, Çanakkale Onsekiz Mart University, Turkey; Vanessa Kind, Durham University, UK; Stacey Lowery Bretz, Miami University, USA; Hannah Sevian, University of Massachusetts Boston, USA; Daniel Tan, Nanyang Technological University, Singapore; Marcy Towns, Purdue University, USA; Georgios Tsaparlis, University of Ioannina, Greece.

Titles in the Series:

1: Professional Development of Chemistry Teachers: Theory and Practice
2: Argumentation in Chemistry Education: Research, Policy and Practice
3: The Nature of the Chemical Concept: Re-constructing Chemical Knowledge in Teaching and Learning
4: Creative Chemists: Strategies for Teaching and Learning
5: Engaging Learners with Chemistry: Projects to Stimulate Interest and Participation
6: The Johnstone Triangle: The Key to Understanding Chemistry
7: Problems and Problem Solving in Chemistry Education
8: Teaching and Learning in the School Chemistry Laboratory
9: Nanochemistry for Chemistry Educators
10: Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices

How to obtain future titles on publication:

A standing order plan is available for this series. A standing order will bring delivery of each new volume immediately on publication.

For further information please contact:

Book Sales Department, Royal Society of Chemistry, Thomas Graham House, Science Park, Milton Road, Cambridge, CB4 0WF, UK
Telephone: +44 (0)1223 420066, Fax: +44 (0)1223 420247
Email: [email protected]
Visit our website at www.rsc.org/books

Student Reasoning in Organic Chemistry

By

Nicole Graulich
Justus-Liebig-University Gießen, Germany
Email: [email protected]

and

Ginger Shultz
University of Michigan, USA
Email: [email protected]

Advances in Chemistry Education Series No. 10
Print ISBN: 978-1-83916-491-0
PDF ISBN: 978-1-83916-778-2
EPUB ISBN: 978-1-83916-779-9
Print ISSN: 2056-9335
Electronic ISSN: 2056-9343

A catalogue record for this book is available from the British Library

© Royal Society of Chemistry 2023

All rights reserved

Apart from fair dealing for the purposes of research for non-commercial purposes or for private study, criticism or review, as permitted under the Copyright, Designs and Patents Act 1988 and the Copyright and Related Rights Regulations 2003, this publication may not be reproduced, stored or transmitted, in any form or by any means, without the prior permission in writing of The Royal Society of Chemistry or the copyright owner, or in the case of reproduction in accordance with the terms of licences issued by the Copyright Licensing Agency in the UK, or in accordance with the terms of the licences issued by the appropriate Reproduction Rights Organization outside the UK. Enquiries concerning reproduction outside the terms stated here should be sent to The Royal Society of Chemistry at the address printed on this page.

Whilst this material has been produced with all due care, The Royal Society of Chemistry cannot be held responsible or liable for its accuracy and completeness, nor for any consequences arising from any errors or the use of the information contained in this publication. The publication of advertisements does not constitute any endorsement by The Royal Society of Chemistry or Authors of any products advertised. The views and opinions advanced by contributors do not necessarily reflect those of The Royal Society of Chemistry which shall not be liable for any resulting loss or damage arising as a result of reliance upon this material.

The Royal Society of Chemistry is a charity, registered in England and Wales, Number 207890, and a company incorporated in England by Royal Charter (Registered No. RC000524), registered office: Burlington House, Piccadilly, London W1J 0BA, UK, Telephone: +44 (0) 20 7437 8656. For further information see our website at www.rsc.org

Printed in the United Kingdom by CPI Group (UK) Ltd, Croydon, CR0 4YY, UK

Dedication

Dedicated to the memory of Prof. George Bodner (08 March 1946–19 March 2021), Purdue University, who inspired so many over decades in their love for organic chemistry education.

  Advances in Chemistry Education Series No. 10 Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices Edited by Nicole Graulich and Ginger Shultz © The Royal Society of Chemistry 2023 Published by the Royal Society of Chemistry, www.rsc.org


Foreword

“How do we know what we know?” sounds like a trivial question, but it is not. Differentiating between justified belief and opinion has never been more important in the sciences, and especially in the public perception of science in general. This epistemology is one of the pillars upon which science and all rational action and reasoning are built. As the title Student Reasoning in Organic Chemistry reveals, the present book edited by Nicole Graulich and Ginger Shultz analyses the teaching of organic chemistry with the goal of improving chemistry instruction. It therefore fills a long-apparent, but never systematically addressed, knowledge gap regarding the outcome of our instructional efforts by highlighting current advances in chemistry education research. From a very personal perspective, I realize that when I draw an organic reaction mechanism, I hope the students will follow my drawings and my words and that it all makes sense. Sometimes I realize that I am not quite sure whether the things I draw are simply memorised to understand the outcome of a reaction—sort of a mnemonic trick—or whether I have reason to believe that this particular mechanistic hypothesis is close to some “scientific reality”. How hard is it to accept that an electron pair is a curved arrow from which most of the action originates? Can the students visualize chemical structures in three dimensions from some simple stick drawings that are, in all honesty, quite a leap of faith? Will different representation styles add to the students’ comprehension or confusion? There are many more questions of this sort that are highly relevant but often not discussed. I am pleased to see that the present book picks up where these questions leave me. I’ve spoken to colleagues about this and many feel similarly.
Still, there are no systematic efforts to determine whether the applied teaching methods, the language, or the pictorial framework ultimately lead to a deep understanding of organic reaction mechanisms that would allow students to reason rationally about the outcome of a reaction that they have not yet seen before. I am quite impressed that the present book tackles these issues with a very fresh look, often including very modern tools to assess students as well as teachers’ habits and performance. As I teach organic reaction mechanisms myself, this book comes in quite handy, and I compliment the Editors on their efforts to put such a valuable resource together. I am sure you will benefit from this collection of excellent articles just as much, and I hope you enjoy the reading!

Dr Peter R. Schreiner
Liebig-Chair and Professor of Organic Chemistry
Justus Liebig University Giessen, Germany
President, German Chemical Society (GDCh, 2020 & 2021)

Preface

Eliciting, Supporting, and Assessing Reasoning

Reasoning is a critical aspect of chemistry that students must learn to become full participants in the discipline. However, reasoning is inherently complex, and science education researchers have struggled to consistently define it and, consequently, to capture patterns of reasoning in student discourse and coursework. We seek to understand reasoning to move beyond the teaching of chemistry as a disconnected set of ideas and to improve the outcomes of chemistry instruction. In chemistry broadly, we have examined various types of reasoning that are distinctive of our discipline, such as mathematical reasoning,1,2 reasoning with chemical representations,3,4 mechanistic reasoning,5–7 and argumentation.8–10 Organic chemistry as a subfield of chemistry requires reasoning that is distinct from other subfields. We know that typical organic chemistry instruction often fails to effectively promote reasoning and relies on assessments that emphasize rote memorization rather than an understanding of underlying phenomena. This recognition has led to an emerging focus on reasoning in organic chemistry that is captured in the chapters of this book, including studies examining the role of representations, in-depth studies of complex reasoning, classroom teaching practices designed to promote reasoning, and approaches to the formative assessment of reasoning.

The first section of the book covers research on student representational competence. Understanding and interpreting complex visual representations is a primary challenge in organic chemistry learning. The contributions highlight how representations used in learning are perceived and understood by students. While solving organic chemistry tasks, students often struggle to understand the underlying meaning of reaction mechanisms and the electron-pushing formalism. Little is known about what students visually focus on while evaluating the plausibility of reaction mechanisms and how that relates to the types of features they discuss in their reasoning. Weinrich & Britt make use of eye tracking, followed by think-aloud interviews, to examine how students visually focus on curved arrows and how this relates to the students’ mentions of explicit and implicit features of the given representation. The study demonstrates that eye-tracking technology is a valuable tool for examining how a feature (e.g., curved arrows) is attended to while evaluating reaction mechanisms and how these insights can be used to derive implications to support students’ mechanistic reasoning [Section A, Chapter 1]. Keller & Habig qualitatively investigate whether students working on tasks involving stereochemistry and pericyclic reactions with augmented-reality support incorporate spatial aspects into their explanations compared to those working without augmented reality [Section A, Chapter 2]. Learners who used the augmented-reality app tended to involve spatial aspects in their reasoning more often and conducted rotation operations correctly more frequently than the control group. The work by Ward, Rotich, Hoang & Popova draws on the seminal work of Kozma and Russell (1997), combining it with Schönborn and Anderson’s model (2008), to characterize how organic chemistry students interpret, translate, generate, and use dash-wedge diagrams and Newman projections [Section A, Chapter 3]. This work demonstrates that the appropriateness of student reasoning can vary across tasks focusing on different representations and different representational competence skills, and with whether the student attends to the external features or the conceptual information embedded in the representation.
The second section of contributions highlights approaches to describing student reasoning about reaction mechanisms expressed in written or verbal contexts. Various frameworks covering aspects of causality and mechanistic reasoning have been used over the last decades. The chapters in this section illustrate in-depth analyses of students engaged with organic chemistry tasks, argumentation, and in-class discourse. Crandell & Cooper provide a synthesis of their research on causal mechanistic reasoning as they have applied it, a description of evidence-based strategies used to engage students in reasoning and modelling, and findings from two longitudinal studies [Section B, Chapter 4]. The longitudinal studies both compare students in traditional introductory courses to those in CLUE courses, designed to elicit causal mechanistic reasoning. Their findings speak to the importance of instruction that is designed to elicit reasoning. Deng, Carle & Flynn illustrate an argumentation framework focusing on reasoning, granularity, and comparisons to characterize students’ arguments in organic chemistry and make use of a constructive alignment approach to guide teaching and assessment [Section B, Chapter 5]. The third chapter in this section, by Asmussen, Rodemer, Eckhard & Bernholt, investigates undergraduate students’ verbal explanations of a series of case comparisons on nucleophilic substitution reactions
and analyses how different concepts were used and related in students’ argumentation [Section B, Chapter 6]. The categorized concepts are transformed into weighted networks to capture the prevalence and centrality of individual concepts across students and tasks, and are further compared to sample solutions. Their study illustrates that students experience difficulties when selecting the concepts relevant to the task at hand, often relying on a single concept when multiple were required. Lieber & Graulich use a detailed process-oriented lens on students’ problem-solving to elicit students’ reasoning processes and their experiences during these processes [Section B, Chapter 7]. By explicitly examining students’ expression of epistemic stances, they describe how these stances influence students’ judgements on claims and their justification with evidence and reasoning. Two case descriptions of students are used to illustrate (1) how epistemic stances and argument components are linked in students’ reasoning processes and (2) how epistemic stances are related to turning points. Hermans & Keller qualitatively analyze how writing comic captions for single mechanistic steps can engage students in describing the how, what, and why of organic mechanisms [Section B, Chapter 8]. They document that students mostly focused on describing the what and how and often neglected to provide causal relationships describing the why of processes. Walsh, Karch & Caspari-Gnann use practical epistemology analysis (PEA) to explore how students reason about organic chemistry problems and learn in-the-moment in natural settings [Section B, Chapter 9]. Applying PEA allows them to describe how gaps in students’ understanding can be characterized and filled during interaction, and how this relates to students’ prior knowledge.
They analyze video recordings of online active-learning sequences facilitated by learning assistants and show how PEA is used as a tool to make student learning during collaborative group discussions visible.

The contributions on classroom practices and student reasoning report on empirical investigations of instructional practices in organic chemistry, both large-class interventions and targeted interventions to support specific aspects of student reasoning. The contributions in this section highlight the influence of additional factors on student reasoning in organic chemistry, such as the pedagogical content knowledge of teaching faculty and teaching assistants. Traditional teaching methods, particularly those that emphasize lecturing, may not be sufficient to promote student reasoning in organic chemistry. How we design organic chemistry learning environments and what we do as instructors matters. In this section, the authors explore classroom environments and teaching practices designed to promote learning and reasoning in organic chemistry. Learning environments that engage students in discussion, drawing, and writing offer the opportunity for students to practice reasoning. Mooring, Burrows & Gamage examine the influence of a flipped-classroom approach on students’ reasoning in organic chemistry [Section C, Chapter 10]. They present a case study of students’ reasoning as they work together on a group quiz activity. Observing students during such group activities may provide unique insight into how learning environments
may be designed to support the development of reasoning. Trabert, Schmitt & Schween describe an experiment-based learning environment designed to foster students’ causal mechanistic reasoning [Section C, Chapter 15]. Students engage in experiment-based case-comparison activities purposefully designed to help them reason using experimental evidence in the classroom laboratory environment. Few studies have examined reasoning during organic chemistry lab work, and further research is needed to understand how instructors can relate laboratory coursework to the theory learned in lecture courses. Two chapters in this section explore the use of the Systemic Approach to Teaching and Learning (SATL) to promote reasoning. Sendur provides an overview of SATL and multiple examples of how systemic diagrams and assessment questions can be applied in organic chemistry [Section C, Chapter 11]. Rončević, Rodić & Horvat extend this by focusing on the application of SATL to diagnose conceptual understanding related to students’ reasoning about organic reaction mechanisms [Section C, Chapter 13]. Focusing specifically on teaching, Atieh, Mitchell-Jones, Xue & Stains examine instructors’ enacted pedagogical content knowledge for teaching resonance in organic chemistry [Section C, Chapter 12]. Their interview study describes how seven organic chemistry instructors plan for teaching, teach, and reflect on teaching resonance. Participating instructors demonstrated a range of knowledge and were organised by the authors into three groups characterized by their knowledge of curriculum, knowledge of students, and knowledge of instructional strategies as they relate specifically to the teaching of resonance in organic chemistry. Finally, Stieff, Scopelitis & Lira present a study of how instructors use embodied actions—their use of hands and bodies during instruction—to model spatial thinking in organic chemistry [Section C, Chapter 14].
In this qualitative investigation, the authors describe how instructors used similar gestures to demonstrate how to perceive spatial information and how their actions differed in response to their teaching environment. We need to know more about how instruction in organic chemistry occurs and its relationship to student learning and the development of reasoning. The third set of contributions reports on novel ways of assessing student reasoning, including automated text analysis, and underscores the importance of assessments that can measure the deeper understandings associated with expert mechanistic reasoning. Bhattacharyya reviews various studies in organic chemistry education that compared typical problems used by instructors to assess students’ reasoning with purposefully designed variations [Section D, Chapter 16]. He illustrates that most of the problems typically used may not adequately assess students’ reasoning and may instead emphasize the use of heuristics or product-oriented approaches. The following chapters provide novel approaches to assessment as well as the use of machine learning to facilitate assessment. The chapters by Watts, Dood & Shultz and by Raker, Yik & Dood illustrate initial steps toward developing educational resources that use machine learning technology to assess reasoning and other learning objectives in organic chemistry. Watts, Dood & Shultz describe a machine learning model that was applied to longer pieces of writing about three different introductory organic reaction mechanisms [Section D, Chapter 17]. The model successfully predicts most components of reasoning characterized by the Russ framework, except for causal mechanistic reasoning. Raker, Yik & Dood add to this topic by describing a generalizable framework for evaluating mechanistic reasoning in student writing [Section D, Chapter 18]. The framework provides an assessment that can differentiate between different aspects of a reaction mechanism (e.g., nucleophiles, proton transfers, etc.) and thus could be used by educators to formatively assess a variety of reactions. It also has the potential to underpin predictive models for analyzing texts describing a variety of chemical reactions. Schwarz, DeGlopper, Ellison, Esselman & Stowe close this selection of chapters on assessment by providing an overview of assessment in chemistry generally and of assessment associated with the practice of chemistry, and they describe two specific assessments they designed to measure three-dimensional (3D) learning in organic chemistry [Section D, Chapter 19]. 3D learning here includes knowledge of fundamental ideas, the practice of science, and understanding of cross-cutting concepts. A summarizing editorial highlights the advances illustrated in this book on researching student reasoning in organic chemistry and identifies gaps and future directions for research.

Nicole Graulich
Ginger Shultz

References
1. K. Bain, J.-M. G. Rodriguez and M. H. Towns, J. Chem. Educ., 2019, 96, 2086–2096.
2. J.-M. G. Rodriguez and M. H. Towns, Chem. Educ. Res. Pract., 2019, 20, 428–442.
3. A. Kraft, A. M. Strickland and G. Bhattacharyya, Chem. Educ. Res. Pract., 2010, 11, 281–292.
4. N. Becker, C. Rasmussen, G. Sweeney, M. Wawro, M. Towns and R. Cole, Chem. Educ. Res. Pract., 2012, 14, 81–94.
5. M. M. Cooper, H. Kouyoumdjian and S. M. Underwood, J. Chem. Educ., 2016, 93, 1703–1712.
6. I. Caspari, D. Kranz and N. Graulich, Chem. Educ. Res. Pract., 2018, 19, 1117–1141.
7. F. M. Watts, J. A. Schmidt-McCormack, C. A. Wilhelm, A. Karlin, A. Sattar, B. C. Thompson, A. R. Gere and G. V. Shultz, Chem. Educ. Res. Pract., 2020, 21(4), 1148–1172.
8. A. Moon, C. Stanford, R. Cole and M. Towns, J. Res. Sci. Teach., 2017, 54, 1322–1346.
9. J. M. Deng and A. B. Flynn, Chem. Educ. Res. Pract., 2021, 22(3), 749–771.
10. L. Lieber and N. Graulich, Chem. Educ. Res. Pract., 2022, 23, 38–54.

Contents

Advances in Chemistry Education Series No. 10
Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices
Edited by Nicole Graulich and Ginger Shultz
© The Royal Society of Chemistry 2023
Published by the Royal Society of Chemistry, www.rsc.org

Chapter 1  Students’ Attention on Curved Arrows While Evaluating the Plausibility of an Organic Mechanistic Step
Melissa Weinrich and Ryan Britt
  1.1 Introduction
  1.2 Theoretical Framework
    1.2.1 Abstractness
    1.2.2 Student Reasoning
    1.2.3 Eye Tracking
  1.3 Research Questions
  1.4 Methods
    1.4.1 Context and Participants
    1.4.2 Data Collection
    1.4.3 Data Analysis
  1.5 Results and Discussion
    1.5.1 Explicit and Implicit Features
    1.5.2 Specific and General Terminology
    1.5.3 Reasoning Based on Sequence vs. Chaining
    1.5.4 AOIs
    1.5.5 Success
  1.6 Conclusions, Implications, and Limitations
  Acknowledgements
  References

Chapter 2  Supporting Spatial Thinking in Organic Chemistry Through Augmented Reality—An Explorative Interview Study
Sebastian Keller and Sebastian Habig
  2.1 Introduction
    2.1.1 Multiple External Representations in Organic Chemistry Learning
    2.1.2 Spatial Reasoning in Organic Chemistry
  2.2 Augmented Reality as an Instructional Aid in Organic Chemistry
  2.3 Aim of the Study
  2.4 Sample and Design
  2.5 Results
    2.5.1 Task 1—Translation Between a Dash-wedge Notation and a Newman Projection
    2.5.2 Task 2—Generating a Newman Projection from a Given Dash-wedge Notation
    2.5.3 Task 3—Translating Between Two Ball-and-stick Models
    2.5.4 Task 4—Determine the Product Conformation
  2.6 Discussion
  References

Chapter 3  Representational Competence Under the Magnifying Glass—The Interplay Between Student Reasoning Skills, Conceptual Understanding, and the Nature of Representations
Lyniesha W. Ward, Fridah Rotich, Julia Hoang and Maia Popova
  3.1 Introduction
    3.1.1 The Role of Representational Competence in Organic Chemistry
    3.1.2 The Interplay Between the Nature of Representations, Conceptual Understanding, and Reasoning
  3.2 Study Design and Methods
  3.3 Findings
    3.3.1 Students' Reasoning While Interpreting Dash-wedge Diagrams and Newman Projections
    3.3.2 Students' Reasoning While Translating Between Dash-wedge Diagrams and Newman Projections
    3.3.3 Students' Reasoning While Generating a Newman Projection from a Dash-wedge Diagram
    3.3.4 Students' Reasoning While Using Newman Projections to Make Inferences About Stability
  3.4 Summary of Findings and Conclusions
    3.4.1 Summary of Findings Across the Tasks that Focused on Various Representational Competence Skills
    3.4.2 Summary of Findings for Each Representative Student
    3.4.3 Conclusions
  3.5 Implications
    3.5.1 Implications for Instruction
    3.5.2 Implications for Research
  Acknowledgements
  References

Chapter 4  Fostering Causal Mechanistic Reasoning as a Means of Modelling in Organic Chemistry
Olivia M. Crandell and Melanie M. Cooper
  4.1 Introduction
  4.2 Causal Mechanistic Reasoning Underpins Expert-like Modeling
  4.3 Characterizing Causal Mechanistic Reasoning Across Different Reactions
  4.4 Eliciting Causal Mechanistic Reasoning—Attention to Scaffolding
  4.5 Causal Mechanistic Reasoning in Organic Chemistry
  4.6 Characterizing the Relationship Between Reasoning and Arrow Drawings
  4.7 Summary
  4.8 Strategies for Fostering Causal Mechanistic Reasoning in Learning Environments
  Acknowledgements
  References

Chapter 5  Students’ Reasoning in Chemistry Arguments and Designing Resources Using Constructive Alignment
Jacky M. Deng, Myriam S. Carle and Alison B. Flynn
  5.1 Introduction
    5.1.1 Citizens Need to be Able to Reason with Scientific Evidence
  5.2 Framework—Reasoning, Granularity, and Comparisons
    5.2.1 Modes of Reasoning
    5.2.2 Levels of Granularity—Moving Between Grain Sizes
    5.2.3 Comparison—Considering Alternatives
  5.3 Students’ Arguments Can Vary Between Tasks
  5.4 Supporting Student Learning Through Constructive Alignment
    5.4.1 Instructional Design
    5.4.2 Scaffolding Skill Development
    5.4.3 Resources for Constructively Aligning Reasoning into a Course
  5.5 Conclusions
  References

Chapter 6  From Free Association to Goal-directed Problem-solving—Network Analysis of Students’ Use of Chemical Concepts in Mechanistic Reasoning
Gyde Asmussen, Marc Rodemer, Julia Eckhard and Sascha Bernholt
  6.1 Introduction
  6.2 Theoretical Background
    6.2.1 Reasons for Students’ Difficulties with Mechanistic Reasoning
    6.2.2 Organization of Knowledge Structure Through Cognitive Networks
  6.3 Research Questions
  6.4 Method
    6.4.1 Cohort
    6.4.2 Case Comparison Tasks
    6.4.3 Data Collection and Analysis
  6.5 Results
  6.6 Discussion and Conclusions
    6.6.1 Implications for Teaching
  Acknowledgements
  References

Chapter 7  Epistemic Stances in Action—Students’ Reasoning Process While Reflecting About Alternative Reaction Pathways in Organic Chemistry
Leonie Lieber and Nicole Graulich
  7.1 Introduction
    7.1.1 Reasoning in Students’ Argumentation
    7.1.2 Toward an Understanding of Epistemic Stances
  7.2 Research Questions
  7.3 Study Design and Methods
    7.3.1 Data Analysis
  7.4 Results and Discussion
    7.4.1 Case 1—Taylor
    7.4.2 Case 2—Robin
  7.5 Conclusion and Implications
  Acknowledgements
  References

Chapter 8  How Do Students Reason When They Have to Describe the “What” and “Why” of a Given Reaction Mechanism?
Jolanda Hermanns and David Keller
  8.1 Introduction
  8.2 Theoretical Background—Mechanistic Reasoning and Writing-to-learn in Organic Chemistry
  8.3 Research Questions
  8.4 Methods
    8.4.1 The Course “Training OC”
    8.4.2 Sample
    8.4.3 The Coding Process
  8.5 Results and Discussion
    8.5.1 RQ1: What is the Quality of Students’ Reasoning Regarding Their Description of the “What” of the Given Reaction Mechanism?
    8.5.2 RQ2: What is the Quality of Students’ Reasoning Regarding Their Description of the “Why” of the Given Reaction Mechanism?
  8.6 Limitations
  8.7 Implications
  Acknowledgements
  References

Chapter 9  In-the-moment Learning of Organic Chemistry During Interactive Lectures Through the Lens of Practical Epistemology Analysis
Katie H. Walsh, Jessica M. Karch and Ira Caspari-Gnann
  9.1 Introduction
    9.1.1 Practical Epistemology Analysis (PEA)
  9.2 Methodology
    9.2.1 Study Context
    9.2.2 Data Collection
    9.2.3 Data Analysis
  9.3 Results and Discussion
    9.3.1 What Drives Student In-the-moment Learning—Gap Patterns
    9.3.2 How Students Learn In-the-moment of Group Discussions—Relation Patterns
  9.4 Conclusions and Implications
  Acknowledgements
  References

Chapter 10  Flipped Classrooms in Organic Chemistry—A Closer Look at Student Reasoning Through Discourse Analysis of a Group Activity
Suazette R. Mooring, Nikita L. Burrows and Sujani Gamage
  10.1 Introduction
    10.1.1 Pre-class Activity—Videos
    10.1.2 Pre-class Activity—Quizzes
    10.1.3 In-class Activity—Student Response Systems
    10.1.4 In-class Activity—Group Work
  10.2 Student Dialogue in a Flipped Course—A Case Study
    10.2.1 The ICAP Framework
    10.2.2 Argumentation and Student Reasoning in Organic Chemistry
    10.2.3 Course Context and Participants
    10.2.4 Group Quiz Format
    10.2.5 Data Collection and Analysis
  10.3 Findings
    10.3.1 Group A Summary
    10.3.2 Quiz 2, Prompt 5—Group B
    10.3.3 ICAP Analysis—Comparison of Group A to Group B
    10.3.4 Argumentation—Comparison of Group A to Group B
  10.4 Conclusions and Implications
    10.4.1 Scaffolding Questions to Promote Argumentation
    10.4.2 Group Composition and Roles
    10.4.3 Incorporating Student Observations in Assessment of Group Activities
  Acknowledgements
  References

Chapter 11  Systemic Assessment Questions as a Means of Assessment in Organic Chemistry
Gulten Sendur
  11.1 Introduction
  11.2 The Role of Scientific Reasoning Skills in Developing Meaningful Understanding in Organic Chemistry
  11.3 Assessment of Students’ Meaningful Understanding in the Context of SATL
    11.3.1 Systemic Diagrams and Systemic Assessment Questions
    11.3.2 Assessment of SAQs
  11.4 Research on Systemic Diagrams in Organic Chemistry Education
  11.5 Example of an Activity to Assess Students’ Meaningful Understanding with SAQ Diagrams in Organic Chemistry Lessons
  11.6 Conclusions and Implications
  References

Chapter 12  Variations in the Teaching of Resonance—An Exploration of Organic Chemistry Instructors’ Enacted Pedagogical Content Knowledge
Emily L. Atieh, Jherian K. Mitchell-Jones, Dihua Xue and Marilyne Stains
  12.1 Introduction
  12.2 Theoretical Framework
    12.2.1 PCK in the Sciences
    12.2.2 Coming to a Consensus on PCK
    12.2.3 Tying It All Together
  12.3 Methods
    12.3.1 Participants
    12.3.2 Data Collection
    12.3.3 Data Analysis
  12.4 Results
    12.4.1 Grouping Instructors by ePCK
    12.4.2 Integrating ePCK Components
    12.4.3 Student Conceptions of the Resonance Hybrid
  12.5 Discussion
    12.5.1 RQ1—Characterizing Instructors’ ePCK
    12.5.2 RQ2—Instructor ePCK and Student Outcomes
  12.6 Limitations
  12.7 Conclusions and Implications
  Acknowledgements
  References

Chapter 13  Investigation of Students’ Conceptual Understanding in Organic Chemistry Through Systemic Synthesis Questions
Tamara Rončević, Dušica D. Rodić and Saša A. Horvat
  13.1 Introduction—Conceptual Understanding in Organic Chemistry
  13.2 Theoretical Foundation
    13.2.1 Organic Reaction Mechanism Problems and Mechanistic Reasoning
    13.2.2 Mental Models and Conceptual Models
    13.2.3 Systemic Diagrams and Systemic Assessment Questions as Effective Conceptual Models
  13.3 Assessing the Quality of Students’ Mental Models and/or Conceptual Structures in Organic Chemistry
    13.3.1 Research Problem, Objectives and Tasks
    13.3.2 Description of Scoring Scheme Applied to the Students’ Generated SSynQs and Obtained Results
  13.4 Concluding Remarks and Implications for Instruction
  Acknowledgements
  References

Chapter 14  Disciplining Perception—Spatial Thinking in Organic Chemistry Through Embodied Actions
Mike Stieff, Stephanie Scopelitis and Matthew Lira
  14.1 Introduction
    14.1.1 Perceptual Learning with Visual Representations
    14.1.2 Disciplining Perception Through Embodied Actions
  14.2 Present Study
    14.2.1 Methods
    14.2.2 Case 1—Making the Steps for Spatial Thinking Visible
    14.2.3 Case 2—Performing Spatial Thinking in a Large Lecture Hall
    14.2.4 Cross-case Analysis
  14.3 Conclusion
  Acknowledgements
  References

Chapter 15  Building Bridges Between Tasks and Flasks—Design of a Coherent Experiment-supported Learning Environment for Deep Reasoning in Organic Chemistry
Andreas Trabert, Catharina Schmitt and Michael Schween
  15.1 Introduction
  15.2 State of Research and Approach to Design
    15.2.1 Research on Student Reasoning
    15.2.2 Design Objectives and Design Principles
    15.2.3 Aggregation and Arrangement of Reaction Mechanisms and Concepts in a Coherent Learning Environment
  15.3 Developments for Secondary and Tertiary Education
    15.3.1 Secondary Education: Learning to Think in Mechanistic Alternatives—SN1 vs. E1 Reactions
    15.3.2 Tertiary Education: Exploring Electronic Substituent Effects—Alkaline Hydrolysis of Substituted Ethyl Benzoates
  15.4 Implications for Implementation and Teaching
  15.5 Conclusion
  Acknowledgements
  References

Chapter 16  Assessment of Assessment in Organic Chemistry—Review and Analysis of Predominant Problem Types Related to Reactions and Mechanisms
Gautam Bhattacharyya
  16.1 Introduction
    16.1.1 Chapter Scope
  16.2 Individual Reactions
  16.3 Synthesis
    16.3.1 Student Solutions to Traditional Synthesis Tasks
    16.3.2 Non-traditional Assessment of Synthesis
  16.4 Electron-pushing Mechanisms (EPMs)
    16.4.1 Traditional Electron-pushing Tasks
    16.4.2 Non-traditional Mechanistic Reasoning Tasks
  16.5 Conclusions
  Acknowledgements
  References

Chapter 17  Developing Machine Learning Models for Automated Analysis of Organic Chemistry Students’ Written Descriptions of Organic Reaction Mechanisms
Field M. Watts, Amber J. Dood and Ginger V. Shultz
  17.1 Introduction
    17.1.1 Eliciting Students’ Mechanistic Reasoning in Organic Chemistry Through Writing
    17.1.2 Machine Learning for Analyzing Student Writing in Chemistry
  17.2 Theoretical Framework
  17.3 Research Questions
  17.4 Methods
    17.4.1 Setting and Participants
    17.4.2 Writing-to-learn Assignments and Implementation
    17.4.3 Data Collection
    17.4.4 Data Analysis
  17.5 Results and Discussion
    17.5.1 RQ1—How do Students Respond to WTL Assignments Intended to Elicit How and Why Organic Reaction Mechanisms Occur?
    17.5.2 RQ2—Does Automated Text Analysis Allow for Predictions of the Components Included in Students’ Written Mechanistic Descriptions?
  17.6 Implications
    17.6.1 Implications for Research
    17.6.2 Implications for Practice
  17.7 Limitations
  17.8 Conclusions
  References

Chapter 18  Development of a Generalizable Framework for Machine Learning-based Evaluation of Written Explanations of Reaction Mechanisms from the Post-secondary Organic Chemistry Curriculum
Jeffrey R. Raker, Brandon J. Yik and Amber J. Dood
  18.1 Are Drawn Reaction Mechanisms Enough to Evaluate Understanding?
  18.2 Learner Understanding of Reaction Mechanisms
  18.3 Assessment of Learner Understanding of Reaction Mechanisms
  18.4 Training Machine Learning Models for Automated Text Analysis
  18.5 Framework for Evaluating Understanding of Reaction Mechanisms
    18.5.1 Levels of Explanation Sophistication
    18.5.2 Evaluating Understanding of Electrophiles
  18.6 Implications for Educators
  18.7 Implications for Researchers
  18.8 A Path toward Better Learning
  Acknowledgements
  References

Chapter 19  The Central Importance of Assessing “Doing Science” to Research and Instruction
Cara E. Schwarz, Kimberly S. DeGlopper, Aubrey J. Ellison, Brian J. Esselman and Ryan L. Stowe
  19.1 Introduction
  19.2 Assessment 101
    19.2.1 Observation
    19.2.2 Interpretation
    19.2.3 Conceptual Change
    19.2.4 How Observation, Interpretation, and Cognition Work Together
  19.3 Assessing Work Aligned with the Practice of Chemistry
  19.4 3D Assessments as Research Tools
  19.5 3D Assessments as a Vital Part of 3D Learning Environments
  19.6 Future Directions for Research on 3D Assessments
  19.7 Conclusion
  Acknowledgements
  References

Postface

Biographies of Authors

Subject Index

SECTION A

           

Chapter 1

Students’ Attention on Curved Arrows While Evaluating the Plausibility of an Organic Mechanistic Step

Melissa Weinrich* and Ryan Britt

Department of Chemistry and Biochemistry, University of Northern Colorado, Greeley, Colorado, 80639, USA
*E-mail: [email protected]

1.1 Introduction

Reaction mechanisms are a central tool for organic chemists in making predictions and understanding how chemical processes occur. Learning to compare reactions and judge the plausibility of reaction mechanisms is a key skill for organic chemistry students and has been the focus of several studies. Bode, Deng, and Flynn1 asked students to discuss which of two nucleophilic substitution mechanisms was plausible and categorized students’ reasoning as descriptive, relational, or causal. Most students provided causal arguments for their claims but did not provide the expected level of granularity and struggled to identify relevant features of the problems. Caspari and Graulich2 developed a teaching scaffold to assist students in considering multiple reaction pathways in organic chemistry mechanisms. Using the scaffold, students were able to identify implicit features within a problem. Lieber and Graulich3 examined the reasoning of students while judging the plausibility of reaction pathways and found that providing students with scaffolded opportunities to reflect on their reasoning led them toward more meaningful learning. In each of these studies, students were interacting with representations of chemical phenomena. When interacting with a representation, we need to understand how the representation encodes information. To do this, we make connections between the represented world (e.g., a sample of butane gas) and the representing world (e.g., a line structure of butane showing the connectivity of atoms).4 When categorizing representations into groups, Kozma and Russell5 found that students focused on surface features of the representations. This focus on surface features has been demonstrated in many contexts within organic chemistry education research.6–15 For example, students often take a connect-the-dots or decorating-with-arrows approach to proposing mechanisms and may not attribute the appropriate meaning to the symbols commonly used in organic chemistry.16–18 When solving problems, students tend to focus on the structure of a substance instead of its function9,10,14,17,19–21 and they tend to focus on just one feature of the problem.22 How a representation is presented can influence what students do and how successful they are while problem solving.23 Flynn & Featherstone24 found that students scored higher on problems when atoms were explicitly shown than when carbons and hydrogens were implicit. DeCocq & Bhattacharyya25 showed that providing students with the product—in addition to the reactants—when asked to propose a mechanism can push students toward reasoning based on backward-chaining (i.e., proposing a mechanism to get to the product) over forward-chaining (i.e., proposing a mechanism based on properties and interactions of the starting materials).
These students preferred line structures over other types of representations because they perceived line structures to have the most relevant and least distracting information. Successful and unsuccessful students may interact with representations in different ways. While solving mechanism problems and given the option between a ball-and-stick representation and an electrostatic potential map, students spent more time viewing the ball-and-stick representation.26 However, when asked to identify the location on a molecule with the most partial positive charge, students who were more successful at the task spent more time viewing the electrostatic potential map. Stieff et al.27 showed that when given multiple representations to solve a problem, students visually focused on just two of the three representations available. In an eye tracking study on students interpreting infrared spectra, Cullipher and Sevian28 found that students with lower conceptual sophistication tended to focus more on the atoms present. Students with more advanced conceptual sophistication had viewing patterns indicating that they were relating the structure of the substances to the spectra. Another eye tracking study, on students’ interpretation of nuclear magnetic resonance (NMR) spectra, showed that novice students tend to distribute their attention evenly across the features of the problem, whereas experts tend to focus their attention on a small number of areas of interest (AOIs).29 Rodemer et al.30 conducted an exploratory eye tracking study to gauge visual scan patterns in students while they worked organic chemistry problems. They found that the more advanced students took less time overall to solve the problems. Additionally, they found that both beginner and advanced students gave more attention to reactants than to products. Overall, the more advanced students made more transitions between AOIs than beginners did, leading to a lower fixation-to-transition ratio. This past eye tracking research has provided information on the types of representations students focus on in organic chemistry and on how students transition between different parts of representations, but it has not characterized what students visually focus on while evaluating the plausibility of reaction mechanisms or how that relates to the types of features they discuss in their reasoning. The relationship between students’ eye gaze and their reasoning strategies in organic chemistry is an area in need of investigation.30 Additionally, curved arrow notation is central for chemists in communicating how a chemical reaction happens, but students struggle to correctly use this notation to describe the movement of electrons in a reaction.16 Thus, we wanted to better understand how students visually focus on curved arrows and how this relates to the types of features they discuss and their reasoning patterns while judging the plausibility of proposed reaction mechanisms.
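The fixation-to-transition ratio discussed above is a simple derived measure: the number of fixations divided by the number of transitions between different AOIs on consecutive fixations. As a purely illustrative sketch, one could compute it like this; the AOI labels and gaze sequences below are invented for illustration and are not data from this or any cited study.

```python
# Hypothetical sketch: computing a fixation-to-transition ratio from a
# sequence of AOI-labelled fixations. AOI names and sequences are invented;
# real eye trackers export richer records than simple label lists.

def fixation_to_transition_ratio(aoi_sequence):
    """Number of fixations divided by the number of transitions
    between different AOIs on consecutive fixations."""
    fixations = len(aoi_sequence)
    transitions = sum(
        1 for prev, curr in zip(aoi_sequence, aoi_sequence[1:]) if prev != curr
    )
    if transitions == 0:
        return float("inf")  # gaze never left a single AOI
    return fixations / transitions

# A viewer who shuttles between AOIs accumulates more transitions
# per fixation, and therefore a LOWER ratio.
beginner = ["reactant", "reactant", "reactant", "arrow", "arrow", "product"]
advanced = ["reactant", "arrow", "reactant", "product", "arrow", "reactant"]

print(fixation_to_transition_ratio(beginner))  # 6 fixations / 2 transitions = 3.0
print(fixation_to_transition_ratio(advanced))  # 6 fixations / 5 transitions = 1.2
```

Under this toy definition, the pattern reported for advanced students (more transitions between AOIs) shows up directly as the lower ratio.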

1.2 Theoretical Framework

1.2.1 Abstractness

When interacting with a representation, students need to be able to extract the encoded information, make connections to prior instances, and extrapolate that knowledge to the new representation. Thus, when solving problems using representations, students need to use abstraction. The representation mapping model developed by Hahn and Chater31 characterizes students’ abstraction by comparing the representations they use and generate while solving a problem. The relative level of abstractness of students’ representations can be characterized by considering how concrete, or how removed from surface features, the referents of those representations are.14 With this framework, relatively lower and higher levels of abstractness can be identified from the following indicators: (1) focusing on explicit vs. implicit features of the problem; (2) describing a sequence of events vs. focusing on properties of entities and explanations of events; (3) focusing on the structure of substances vs. their function; and (4) using specific vs. generalized terminology.14

Consider hypothetical students’ responses to proposing a mechanism for the reaction between the substances shown in Figure 1.1. While interacting with these structures, one student could say “The oxygen on ethoxide would attach to the carbon that the bromide is attached to and that would kick out the bromide.” However, another student might say “The oxygen has a negative charge so it can act as a nucleophile and be attracted to the partial positive charge on the carbon that the bromide is attached to, kicking out the bromide.” The first student focused on the explicit feature of the atom symbols in the problem, whereas the second student focused on the concept of partial charge. The second student used that partial charge to explain the proposed events, whereas the first student described just a sequence of events. While describing this sequence of events, the first student focused on the structure of the substances, while the second student considered the function of those structures (i.e., acting as a nucleophile). Additionally, the second student used more generalized terminology (nucleophile) and the first student used more specific terminology (ethoxide). In these examples, for each indicator of abstractness, the first student showed relatively lower levels of abstractness than the second student. Even though the second student demonstrated higher levels of abstractness, that does not mean the student was more successful: in both cases the students incorrectly assumed that a substitution process would predominate in this reaction over an elimination reaction. This abstractness framework14,31,32 guided our investigation of the types of features students discuss while judging the plausibility of reaction mechanisms.

[Figure 1.1 Example substances.]

1.2.2 Student Reasoning

Although there are many frameworks for characterizing student reasoning, we selected one that has specifically been used to characterize students’ reasoning about organic reaction mechanisms. Thus, our investigation of students’ reasoning was framed around their chaining of reasoning strategies.33 Darden defined chaining as “reasoning about one part of a mechanism on the basis of what is known or conjectured about other parts of a mechanism” (p. 362).34 Within this framework, chaining can be further categorized as “forward” or “backward.” Forward chaining occurs when the properties of an entity are used to infer subsequent mechanistic steps. For example, a student may note that a carbonyl-containing compound is present with a nucleophilic reagent and predict that nucleophilic addition will take place. Thus, the student is reasoning in the forward direction. In contrast, backward chaining uses knowledge of a future step in the mechanism to make inferences about prior steps. For example, if the starting material contains an alcohol functional group and the product does not, a student may determine that the hydroxyl group must function as a leaving group at some point, which further suggests a prior protonation step. Backward chaining can also be used to rule out a proposed mechanism if it would generate an implausible or unproductive intermediate.

1.2.3 Eye Tracking

Eye tracking technology can measure where on a visual stimulus a participant is looking, and for how long, by recording eye fixations (pauses in eye movement). This research was guided by the eye-mind assumption of eye tracking, which presumes that when people fixate their visual attention on a referent, they are mentally processing that referent.35 Thus, eye fixations are a good measure of where students are focusing their attention.36 An eye tracking instrument can record the duration of these eye fixations on specific AOIs. This fixation duration (the length of time that eye movement pauses on an AOI) can then be analyzed to compare how long a participant spends processing different areas of the stimulus.37 Although students’ eye movements are highly correlated with their verbal descriptions, eye data are paired with verbal interviews to better understand students’ cognitive processes.27 For example, correlation analysis can be used to compare fixation duration to other measures, such as students’ accuracy at answering questions.26
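To make these measures concrete, the following is a hypothetical sketch of how fixation durations might be aggregated into per-AOI proportions and compared with accuracy via a simple correlation. All AOI names and numbers are invented for illustration; they do not reflect this study's data or the actual export format of any eye tracking software.

```python
# Hypothetical sketch: aggregating fixation durations per AOI and
# correlating time-on-AOI with accuracy. All data below are invented.

def aoi_proportions(fixations):
    """fixations: list of (aoi_name, duration_ms) tuples.
    Returns {aoi_name: proportion of total fixation duration}."""
    totals = {}
    for aoi, duration in fixations:
        totals[aoi] = totals.get(aoi, 0) + duration
    grand_total = sum(totals.values())
    return {aoi: t / grand_total for aoi, t in totals.items()}

def pearson_r(xs, ys):
    """Plain Pearson correlation. A rank-based (Spearman) variant is
    often preferred for small eye tracking samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One participant's fixation record on one stimulus (invented values):
record = [("arrow_1", 420), ("reactant", 900), ("arrow_1", 300), ("prompt", 180)]
props = aoi_proportions(record)
print(round(props["arrow_1"], 2))  # 720 ms / 1800 ms = 0.4

# Per-participant proportion of time on curved-arrow AOIs vs. accuracy:
arrow_time = [0.10, 0.25, 0.30, 0.45]
accuracy = [0.5, 0.6, 0.8, 0.9]
print(round(pearson_r(arrow_time, accuracy), 2))  # 0.95 on this toy data
```

The design choice here mirrors the analysis idea in the text: fixation duration is reduced to a per-AOI proportion so that participants with different total viewing times remain comparable before any correlation with accuracy is computed.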

1.3 Research Questions

To investigate the types of features students paid attention to and considered while judging the plausibility of a proposed reaction mechanism, we explored the following research questions.

RQ1: What was the relationship between the types of chemical features (implicit/explicit) students discuss and the proportion of time they viewed curved arrow AOIs?
RQ2: What was the relationship between the chaining types of student reasoning (sequence, forward chaining, backward chaining) and the percentage of time they viewed curved arrow AOIs?
RQ3: What was the relationship between students’ success in judging the plausibility of reaction mechanisms and the percentage of time they viewed curved arrow AOIs?

1.4 Methods

1.4.1 Context and Participants
Undergraduate students were recruited from a second-semester organic chemistry course at a public university in the Rocky Mountain region of the United States. Twenty students participated in this study during the last four weeks of the Spring 2019 semester. Participants were offered extra credit for participating. The Institutional Review Board of the University of Northern Colorado approved this research.

1.4.2 Data Collection
In this mixed methods study, quantitative data were collected using eye tracking and qualitative data were collected through interviews. The instrument used in this study was created by first reviewing lecture notes provided by the instructor of the course and students' mid-term exam responses. Based on these lecture notes and student-generated mechanisms, eleven organic mechanistic steps (Figure 1.2) with curved arrows were created. Some steps were drawn in a plausible manner (Q1, 3, 5, 8 and 10), and others were implausible (Q2, 4, 6, 7, 9 and 11). The mechanistic steps were reviewed by the instructor of the course for their appropriateness. Within Tobii Pro Studio, AOIs were pre-assigned to each starting material, curved arrow, and question prompt. Each structure and arrow had sufficient white space between them for the eye tracking instrument to distinguish these AOIs (1 cm for this instrument).38 Each AOI used in this study can be found in the supplemental information.
Student eye movements were captured using a Tobii T120 eye tracker (120 Hz), which recorded eye fixation durations on each AOI. During data collection, a five-point calibration was used. Data collection was piloted in the previous summer session with three students to check that the question format was understandable. During data collection, students were shown each proposed mechanism one at a time and asked to rate each one as "plausible" or "implausible". They moved through each problem at their own pace while the eye tracker captured their eye movements. During the eye tracking portion of data collection, students did not talk.
After working through all the problems, students were interviewed and asked to explain their choice of "plausible" or "implausible" for each question. Students were asked follow-up questions to elaborate on their reasoning (e.g., "You said x, what does that mean?", "Can you tell me more about that?", "How did you know that could/could not happen?", "Is there anything else that tells you this is a plausible/an implausible step?"). Students could write while describing their thinking using a Livescribe pen. Audio recordings of interviews were transcribed.

Figure 1.2 Eleven organic mechanism steps where students were prompted to evaluate whether each proposed mechanism was plausible (Q1, 3, 5, 8 and 10) or implausible (Q2, 4, 6, 7, 9 and 11).

1.4.3 Data Analysis
Thematic analysis of the interview transcripts was done in NVivo to characterize students' abstractness, reasoning patterns, and success at judging the plausibility of a mechanistic step.39 We characterized students' relative abstractness based on the terminology they used (specific vs. general) and the types of features they discussed (explicit vs. implicit). When students used the specific name of a substance, we coded that as specific terminology. When students discussed an entity with a more generalized name (e.g., alcohol, nucleophile), we coded that as generalized terminology. If students referenced a feature that was clearly indicated in the problem (such as an atom, charge, lone pair dots, etc.), it was coded as explicit. When students talked about a feature that was not clearly indicated (partial charge, carbons and hydrogens hidden in line structures, etc.), it was coded as implicit. Each of these was coded a maximum of once per problem to represent whether that code was present in that student's response to that problem.
We were also interested in the type of justification students provided for their answers. Broadly, there were three types. First, some students merely described the sequence of events as they understood them, without appealing to any properties of the starting material or product. Second, students employed forward chaining, making a prediction by analyzing the properties and reactivity of the starting material. Third, students used backward chaining, imagining a future state or intermediate and using that conception to justify earlier mechanistic steps. Each of these was likewise coded a maximum of once per problem.
Finally, to compare students' success at judging the plausibility of these mechanistic steps, their verbal and written responses were scored as if grading an exam. This scoring scale ranged from 0 to 3 (0 = unsuccessful; 1 = mostly unsuccessful; 2 = mostly successful; 3 = successful). For example, a student scored 0 if they did not demonstrate a correct concept. This included arriving at the anticipated (canonically correct) response of plausible or implausible based on the material they learned in class but making this judgement for faulty reasons.
A student scored a 1 if their overall response was incorrect, but they demonstrated at least one correct concept. For example, they might correctly describe that partial positive and partial negative charges attract, but incorrectly identify these charges throughout the problem. A student scored a 2 if their overall response was correct, but they demonstrated at least one incorrect concept. For example, they might correctly identify partial charges throughout the problem and recognize that these charges attract, but might not evaluate which partial charges are most likely to interact. A student scored a 3 if they demonstrated no errors based on the material they learned in their course.
To establish intercoder agreement, both authors coded five interviews (25% of the data). Initially, there was 93% agreement, and code applications were discussed until agreement was reached. The first author then coded the remaining interviews.
Eye tracking data were analyzed using Tobii Pro Studio. Total fixation duration on each AOI (the words of the prompt, the structures of the starting materials, and the curved arrows) was converted to percentages based on the total time a student spent on all AOIs. This made comparisons possible, since each student worked through the problems at their own pace. Spearman's correlations were run in SPSS to compare the proportion of time students fixated on curved arrow AOIs with the frequency of occurrence across the eleven problems of each type of reasoning pattern, type of feature (explicit/implicit) discussed, terminology used (specific/general), and their success. The Spearman correlation was selected over other correlation statistics because not all variables of interest were normally distributed. Eleven data sets from each of the 20 participants were analyzed, giving a total of 220 data sets for this study.
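The authors ran Spearman's correlations in SPSS. As a minimal illustration of the statistic itself (not the study's analysis code, and with hypothetical data), the following pure-Python sketch computes Spearman's rho as the Pearson correlation of average ranks, which is why it is robust to non-normally distributed variables:

```python
# Illustrative sketch of Spearman's rank correlation (hypothetical data).

def rank(values):
    """Average 1-based ranks; tied values share the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over a run of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation computed on the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-student data: % of time on arrow AOIs vs. a code frequency
arrow_pct = [10, 12, 15, 18, 22]
code_freq = [0, 1, 1, 2, 3]
print(round(spearman(arrow_pct, code_freq), 3))
# 0.975
```

Because only ranks enter the computation, monotonic but non-linear relationships are captured, matching the authors' rationale for choosing Spearman over other correlation statistics.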

1.5 Results and Discussion
To answer the research questions, we will first describe the features, terminology, reasoning, and AOIs students used. During these interviews, students paid attention to a variety of features (explicit and implicit) to judge the plausibility of these mechanisms. They used specific and generalized terminology. Some students described a sequence of events without an explanation for why that sequence could occur, whereas other students used forward or backward chaining to justify their decision about the plausibility of a proposed mechanism.

1.5.1 Explicit and Implicit Features
Students mentioned both explicit and implicit features of the problems. Examples of the explicit features students discussed included the atoms, bonds, lone pair dots, or charges shown in the problem. Examples of the implicit features students discussed included carbons and hydrogens hidden in line structures, electrons that were not explicitly shown, partial charges, hypothetical charges and bonds arising from proposed mechanisms, and implicit properties of the entities such as electronegativity, acidity, and nucleophilicity. For example, the following student discussed mainly explicit features:

   I would say implausible at the moment because the arrow should be pointing the other way as it wants the hydrogen. Then the hydrogen would be cleaved off so there might be an arrow pointing towards the bonds of the Ph3PCH2. […] As I've been looking at this more and more, it seems the arrow pointing the other way means that it will take whatever is there – Q6, P. 17

This student paid attention to features that were explicitly shown in the problem such as an arrow, the atoms, and bonds. Another student discussed a combination of explicit and implicit features:

   I think this one is plausible. It's just [a] Ph3P group. I think it would be able to 'cause bromine is a really good leaving group so if this phosphorus wanted to come in and attack this partial positive on the methyl. Bromine would be a fine leaving… It's a good leaving group. Then it would have Ph3P with a CH3 group – Q10, P. 12

This student mentioned the explicitly shown features of the atoms present, but also considered the implicit partial charge of the atoms, which was not explicitly shown in the problem (Figure 1.3).

Figure 1.3 Participant 12’s written response to Q10.

1.5.2 Specific and General Terminology
Students used both specific and general terminology. Examples of the specific terminology students used included the names of substances involved such as hydronium, water, chloride, and methyl group. Examples of the general terminology students used included the terms acid/base, nucleophile/electrophile, leaving group, and functional group names. In the following two quotations, the first student used specific terminology (hydronium), whereas the second student used more general terminology (acid).

   The oxygen will attack, with its negative charge, on the hydronium and protonate itself – Q3, P. 9

   'Cause it's an acidic solution it would form the activated carbonyl right here by abstracting this hydrogen – Q11, P. 11

1.5.3 Reasoning Based on Sequence vs. Chaining
While judging the plausibility of these mechanistic steps, many students described a sequence of events, whereas other students explained why that sequence of events could or could not happen using forward and/or backward chaining. Consider this student's reasoning for judging a step to be plausible:

   I said yes because all that's really happening is a methyl group is being added to the phosphorus. Then the bromine is detached from the methyl becoming an ion – Q10, P. 14

   Interviewer: How did you know that methyl group could get added to the phosphorus?

   P14: Because the double bond is reaching out and attaching to the carbon of the methyl and then simultaneously bromine comes off as an ion. They become two separate compounds and the methyl is able to attach to the phosphorus.


This student justified the plausibility of the mechanism by describing the steps they saw occurring. When asked follow-up questions, this student continued to justify their decision by simply describing the sequence they saw. However, other students used properties of the substances involved to justify their reasoning. For example:

   Adding those electrons to that would make it stable. […] 'cause this one right here makes it stable and adding this one here in order to make a double bond – Q3, P. 3

To this student, electrons were added to make a stable double bond in the product. This student imagined the product as a "stable" substance and determined the mechanism to be plausible based on this feature of the product. This student was using backward chaining. Other students focused on properties of the starting materials to justify their decision:

   We have an activated carbonyl because of the oxygen being positively charged. You have ethanol so that's a weak nucleophile which would kick up those electrons. That seems plausible that it could happen – Q8, P. 13

This student justified the plausibility of the mechanism step by describing the charge and nucleophilicity of the starting materials. This student used forward chaining to conclude the mechanism was plausible. The average frequency of each type of terminology, feature, and reasoning across the eleven problems is presented in Table 1.1. For example, these students used forward chaining on average in 5.6 out of the 11 problems.

Table 1.1 Average frequency of occurrence of each code across the eleven problems.

Code                                                      Average occurrence
Terminology
  Specific terminology (i.e., name of substance)          4.25/11
  General terminology (i.e., name of functional group)    3.7/11
Explicit features
  Atoms (explicit)                                        9.8/11
  Bonds (explicit)                                        4.9/11
  Lone pair dots (explicit)                               3.6/11
  Charge (explicit)                                       4.6/11
Implicit features
  Atoms (implicit)                                        4.3/11
  Bonds (implicit)                                        2.5/11
  Electrons (implicit)                                    3.9/11
  Charge (implicit)                                       3.5/11
  Functional group                                        2.6/11
  Octet rule                                              0.7/11
  Nucleophile                                             1.8/11
Reasoning (a)
  Sequence without explanation                            2.8/11
  Forward chaining                                        5.6/11
  Backward chaining                                       3.4/11
Success                                                   1.7/3

(a) These add to more than 11 because students could use both backward and forward chaining in the same problem.

1.5.4 AOIs
The proportion of time spent viewing curved arrows correlated with the use of generalized terminology and some implicit features, but not with the types of reasoning students provided nor their success. Students spent most of their time viewing the structures of the starting materials (79%), compared to the AOIs of the words of the prompt (6%) and the curved arrows (15%). There was some variability in the proportion of time students spent viewing the curved arrows, with some students viewing the curved arrows for a larger proportion of time (22% maximum) than other students (10% minimum). A comparison of the proportion of time students viewed curved arrow AOIs to their use of terminology, explicit/implicit features, reasoning, overall time spent, and success showed a few significant correlations (Table 1.2).
RQ1: What was the relationship between the types of chemical features (implicit/explicit) students discussed and the proportion of time they viewed curved arrow AOIs? Students who spent a larger proportion of their time viewing the arrows also tended to use general terminology (such as functional group names) and describe some implicit features of a problem such as partial charge. They tended to double check that the resulting structure would follow the octet rule.
RQ2: What was the relationship between the chaining types of student reasoning (sequence, forward chaining, backward chaining) and the percentage of time they viewed curved arrow AOIs? There was not a significant correlation between the proportion of time spent viewing curved arrows and the type of reasoning the student demonstrated.
RQ3: What was the relationship between students' success at judging the plausibility of reaction mechanisms and the percentage of time they viewed curved arrow AOIs? Students' success at solving the problems did not appear to correlate with time spent viewing the curved arrows.

1.5.5 Success
Student success correlated with the use of generalized terminology and discussion of implicit features, but not with the types of reasoning students provided nor the proportion of time viewing curved arrows. There was a range of student performance in judging the plausibility of these proposed mechanistic steps across the eleven problems (average score 1.7/3, minimum average score 0.3/3, maximum average score 2.6/3). A comparison of students' success to their use of terminology, explicit/implicit features, reasoning, overall time spent, and proportion of time spent viewing curved arrows showed a few significant correlations (Table 1.2).

Table 1.2 Spearman correlations between the proportion of time students spent viewing curved arrows and the types of terminology they used, features they discussed, reasoning, total time problem solving, and students' success.

                                       Proportion of time viewing
                                       curved arrow AOIs                Student success
Code                                   Correlation    Sig (2-tailed)    Correlation    Sig (2-tailed)
Specific terminology
  (i.e., name of substance)            0.420          0.065             0.210          0.374
General terminology
  (i.e., name of functional group)     0.476(a)       0.034             0.559(a)       0.010
Sum of explicit features               −0.021         0.831             0.051          0.831
  Atoms (explicit)                     −0.342         0.164             −0.061         0.799
  Bonds (explicit)                     −0.299         0.200             −0.305         0.190
  Lone pair dots (explicit)            −0.069         0.774             0.130          0.586
  Charge (explicit)                    0.066          0.781             −0.043         0.856
Sum of implicit features               0.306          0.190             0.618(b)       0.004
  Atoms (implicit)                     −0.331         0.154             0.358          0.121
  Bonds (implicit)                     −0.039         0.869             −0.048         0.841
  Electrons (implicit)                 −0.357         0.122             0.148          0.534
  Charge (implicit)                    0.522(a)       0.018             0.354          0.126
  Functional group                     0.619(b)       0.004             0.489(a)       0.029
  Octet rule                           0.620(b)       0.004             0.045          0.849
  Nucleophile                          0.307          0.187             0.444(a)       0.050
Reasoning
  Sequence without explanation         −0.138         0.563             −0.061         0.797
  Forward chaining                     0.240          0.308             0.291          0.214
  Backward chaining                    −0.257         0.274             −0.135         0.569
Total time                             −0.420         0.066             −0.421         0.064
Success                                0.142          0.549             —              —

(a) Correlation is significant at the 0.05 level. (b) Correlation is significant at the 0.01 level.

Students who were more successful at solving these problems also tended to use general terminology and discuss implicit entities. They tended to use functional group names, discuss the presence of acids or bases, and describe the actions of a nucleophile in a mechanism. However, there was not a significant correlation between students' success and the proportion of time spent viewing curved arrows. Previous research has shown that more successful problem solvers tend to spend less time overall working through a problem.26 Although there appeared to be a slight trend of students who spent more time working through the problems being less successful, this was not significant. Additionally, students' success at solving the problems did not appear to correlate with time spent viewing the curved arrows.

1.6 Conclusions, Implications, and Limitations
While students judged the plausibility of proposed mechanism steps, we characterized the types of features they focused on verbally and how that related to their visual attention on curved arrows. Students in the study described explicit or implicit features (or both) in their reasoning. The language they used included either specific or general terminology (or both). Unsurprisingly, students spent more of their time viewing the reactants displayed than the curved arrows.
A comparison of the proportion of time students spent viewing arrows to the types of features they discussed showed that students who spent more time viewing the arrows also tended to discuss some implicit features (such as identifying functional groups or partial charges) and use general terminology (such as talking about a functional group or nucleophile) while justifying their assessment of the plausibility of the reaction mechanism. This could imply that students who consider implicit features and general terminology might find arrows useful representations for processing these concepts. Alternatively, focusing on curved arrow representations might help students consider implicit features and general terminology. These students tended to follow the flow of electrons and verbally double check that the resulting structure would fulfill the octet rule, which could imply that curved arrows were a useful tool for students to think through this concept. Additionally, we found that successful students also tended to discuss implicit entities and general terminology (such as identifying that an acid, nucleophile, or functional group was present).
This aligns with previous studies that have reported that students who focus on implicit features can be successful at problem solving.14,33,40 Although we identified relationships between the proportion of time students viewed curved arrows and the types of terminology (specific/general) and entities (implicit/explicit) they discussed (RQ1), we did not identify any significant relationships between the proportion of time students viewed curved arrows and the types of chaining reasoning (sequence without explanation, forward chaining, backward chaining) they used (RQ2). Since the products were not shown, this may have influenced the types of reasoning strategies students used.25 Additionally, we did not identify any significant relationships between the proportion of time students viewed curved arrows and their success at judging the plausibility of these mechanism steps (RQ3). As with many eye tracking studies (e.g., Williamson et al.),26 the small size of the sample in this study may have limited the results.
These results have implications for both future research and teaching. For teaching, having students practice reaction mechanisms with curved arrows may not be enough to develop their reasoning patterns. Instead, it might be more productive to focus on building additional conceptual scaffolding to help students think through the flow of electrons and their reasoning.3 Since we did not see a relationship between students' reasoning patterns and the proportion of time they spent viewing curved arrows, future studies could explore students' viewing patterns of curved arrows and other features of a problem, such as atoms, bonds, lone pairs, and charges, to understand how students consider the directionality of curved arrows. A limitation of this study could be the visual complexity of the selected problems; future eye tracking research could explore student reasoning with problems spanning a range of visual complexity. Additionally, further research is needed that examines the visual features within a reaction mechanism in more detail, as this study could only compare students' viewing of the arrows and the whole structure of a starting material. This study could not examine details within a structure, such as the types of atoms, bonds, lone pairs, or charges students focused on.

Acknowledgements We thank the participants of this study and the instructor of the course for providing access to participants, sharing their lecture notes, and reviewing the instrument. Additionally, we would like to thank Michael Franklin for providing feedback in the initial stages of the project.

References
1. N. E. Bodé, J. M. Deng and A. B. Flynn, J. Chem. Educ., 2019, 96, 1068–1082.
2. I. Caspari and N. Graulich, Int. J. Phys. Chem. Educ., 2019, 11, 31–43.
3. L. Lieber and N. Graulich, Chem. Educ. Res. Pract., 2021, 23, 38–54.
4. S. Ainsworth, Learn. Instr., 2006, 16, 183–198.
5. R. B. Kozma and J. Russell, J. Res. Sci. Teach., 1997, 34, 949–968.
6. D. S. Domin, M. Al-Masum and J. Mensah, Chem. Educ. Res. Pract., 2008, 9, 114–121.
7. A. Kraft, A. M. Strickland and G. Bhattacharyya, Chem. Educ. Res. Pract., 2010, 11, 281–292.
8. L. McClary and V. Talanquer, J. Res. Sci. Teach., 2011, 48, 396–413.
9. D. Cruz-Ramírez de Arellano and M. H. Towns, Chem. Educ. Res. Pract., 2014, 15, 501–515.
10. M. E. Anzovino and S. L. Bretz, Chem. Educ. Res. Pract., 2016, 17, 1019–1029.
11. K. R. Galloway, C. Stoyanovich and A. B. Flynn, Chem. Educ. Res. Pract., 2017, 18, 353–374.
12. K. R. Galloway, M. W. Leung and A. B. Flynn, Chem. Educ. Res. Pract., 2019, 20, 30–52.
13. N. Graulich and G. Bhattacharyya, Chem. Educ. Res. Pract., 2017, 18, 774–784.




14. M. L. Weinrich and H. Sevian, Chem. Educ. Res. Pract., 2017, 18, 169–190.
15. M. Popova and S. L. Bretz, J. Chem. Educ., 2018, 95, 1086–1093.
16. G. Bhattacharyya and G. M. Bodner, J. Chem. Educ., 2005, 82, 1402.
17. R. Ferguson and G. M. Bodner, Chem. Educ. Res. Pract., 2008, 9, 102–113.
18. N. P. Grove, M. M. Cooper and K. M. Rush, J. Chem. Educ., 2012, 89, 844–849.
19. A. M. Strickland, A. Kraft and G. Bhattacharyya, Chem. Educ. Res. Pract., 2010, 11, 293–301.
20. M. N. Petterson, F. M. Watts, E. P. Snyder-White, S. R. Archer, G. V. Shultz and S. A. Finkenstaedt-Quinn, Chem. Educ. Res. Pract., 2020, 21, 878–892.
21. D. Xue and M. Stains, J. Chem. Educ., 2020, 97, 894–902.
22. N. Graulich, Chem. Educ. Res. Pract., 2015, 16, 9–21.
23. R. Kozma and J. Russell, in Models and Modeling in Science Education, ed. J. K. Gilbert, Springer, Dordrecht, 2005, pp. 121–145.
24. A. B. Flynn and R. B. Featherstone, Chem. Educ. Res. Pract., 2017, 18, 64–77.
25. V. DeCocq and G. Bhattacharyya, Chem. Educ. Res. Pract., 2019, 20, 213–228.
26. V. M. Williamson, M. Hegarty, G. Deslongchamps, K. C. Williamson and M. J. Shultz, J. Chem. Educ., 2013, 90, 159–164.
27. M. Stieff, M. Hegarty and G. Deslongchamps, Cognition and Instruction, 2011, 29, 123–145.
28. S. Cullipher and H. Sevian, J. Chem. Educ., 2015, 92, 1996–2005.
29. J. J. Topczewski, A. M. Topczewski, H. Tang, L. K. Kendhammer and N. J. Pienta, J. Chem. Educ., 2017, 94, 29–37.
30. M. Rodemer, J. Eckhard, N. Graulich and S. Bernholt, J. Chem. Educ., 2020, 97, 3530–3539.
31. U. Hahn and N. Chater, Cognition, 1998, 65, 197–230.
32. H. Sevian, S. Bernholt, G. A. Szteinberg, S. Auguste and L. C. Pérez, Chem. Educ. Res. Pract., 2015, 16, 429–446.
33. I. Caspari, M. L. Weinrich, H. Sevian and N. Graulich, Chem. Educ. Res. Pract., 2018, 19, 42–59.
34. L. Darden, Philos. Sci., 2002, 69, S354–S365.
35. M. A. Just and P. A. Carpenter, Cogn. Psychol., 1976, 8, 441–480.
36. J. E. Hoffman and B. Subramaniam, Percept. Psychophys., 1995, 57, 787–795.
37. J. R. VandenPlas, S. J. R. Hansen and S. Cullipher, Eye Tracking for the Chemistry Education Researcher, American Chemical Society, Washington, DC, 2018.
38. K. Holmqvist, M. Nyström, R. Andersson, R. Dewhurst, H. Jarodzka and J. van de Weijer, Eye Tracking: A Comprehensive Guide to Methods and Measures, OUP Oxford, 2011.
39. V. Braun and V. Clarke, Qual. Res. Psychol., 2006, 3, 77–101.
40. N. Graulich, S. Hedtrich and R. Harzenetter, Chem. Educ. Res. Pract., 2019, 20, 924–936.

Chapter 2

Supporting Spatial Thinking in Organic Chemistry Through Augmented Reality—An Explorative Interview Study

Sebastian Keller*a and Sebastian Habigb

a University Duisburg-Essen, Chemistry Education, Schuetzenbahn 70, 45127 Essen, Germany; b Friedrich-Alexander-Universität Erlangen-Nürnberg, Chemistry Education, Regensburgerstr. 160, 90478 Nuremberg, Germany
*E-mail: [email protected]

Advances in Chemistry Education Series No. 10
Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices
Edited by Nicole Graulich and Ginger Shultz
© The Royal Society of Chemistry 2023. Published by the Royal Society of Chemistry, www.rsc.org

2.1 Introduction

2.1.1 Multiple External Representations in Organic Chemistry Learning
Organic chemistry contains a large number of different substance classes, compounds, and specific reactions due to the special bonding behaviour of the element carbon. As in all science-related disciplines, communication on a professional as well as on an educational level is based on various representations and conventions.1–4 Examples of typical pictorial representations used to depict organic chemical information are wedge-dash notations, valence structural formulas, ball-and-stick models, and energy diagrams. However, these pictorial representations alone are not sufficient to promote conceptual understanding. Therefore, they are usually combined with textual information in instructional designs.5–7 This combination of so-called multiple external representations has proven effective for conveying abstract concepts and/or complex cause-and-effect systems, which are typical in organic chemistry.8,9,22
During their learning process, learners have to transform external representations, e.g. text and pictures, into a coherent mental model.10 Mayer's Cognitive Theory of Multimedia Learning assumes that visual and verbal information are processed via different cognitive channels in working memory.11 As working memory capacity is limited, the different cognitive channels can only process a limited amount of information simultaneously.7,10 Mayer also points out that learners need to engage actively with the specific learning subject to initiate the learning process. When these conditions are fulfilled, successful learning can occur as the learner selects relevant information, builds their own mental model, and finally integrates it with existing prior knowledge.11 The construction of a coherent mental model is particularly demanding when a new topic is to be learned with little prior knowledge. The fact that educators often expect learners to use subject-specific visualizations to learn new concepts, without learners having the competencies to use them, is what Rau describes as the representational dilemma.4 Given that a wide range of representations is used to illustrate chemical aspects, this representational dilemma is critical and might hinder successful chemistry learning.2,4,12 The wide range of representations in organic chemistry makes it possible to illustrate chemical phenomena at different levels of abstraction. What allows professionals to focus on the view of interest can quickly overwhelm novices.
To benefit from this huge variety of representations, learners need sophisticated representational competencies to see the interconnections between them.3,13–15 They need to be able to translate representations into one another and understand the illustrated chemical relationships.3,13,14 As all organic molecules and reactions have three-dimensional geometries, the corresponding representations must also express those geometries and the related spatial characteristics. To make sense of these representations, learners must cognitively process them in three-dimensional space and identify the molecules' spatial characteristics in order to compare them with equivalent two-dimensional representations. Furthermore, mental rotation ability, as a component of spatial thinking, has proven to be a predictor of learning success in organic chemistry.13,15 Therefore, besides representational competencies, learners also need sophisticated spatial abilities for the successful comprehension of organic chemistry representations.2,3 Research shows that students with high spatial abilities outperform students with low spatial abilities when solving organic chemistry tasks.16–18 Furthermore, they are more often able to explain their path to a solution verbally and in more detail.16–18 This shows that spatial ability, which can be developed over time through appropriate training and support,13,19–21 is a key factor for learning success in organic chemistry.


2.1.2  Spatial Reasoning in Organic Chemistry Due to the huge variety of molecules and reactions in organic chemistry, it is neither possible nor necessary to learn every single reaction separately. Molecules and reactions with similar chemical properties or performances can be summarized in different categories. Those characteristics must be learned only for the specific category and not for every molecule or reaction solely. If learners develop this kind of conceptual knowledge, they can apply this knowledge to nearly all molecules and reactions of this category.22 Graulich and Schween point out that the acquisition of conceptual understanding seems to be one of the main challenges students face when elaborating organic chemistry, because learners often perceive organic chemistry as a huge collection of single cases.23–25 Research also shows, that learners often consider the static structure of a molecule without thinking of the dynamic process taking place during a chemical reaction.26 Graulich and Schween’s aim is to let learners actively experience how conceptual reasoning is independent of a given molecular structure or reaction type.23 A learner who has a broad conceptual understanding of a specific topic, can reason, and argue about a given task or problem from this context. 
Lieber and Graulich describe the building of arguments in organic chemistry as a central ability that allows students to make well-grounded decisions.28 According to Toulmin's claim-evidence-reasoning model, reasoning serves as the anchor that connects evidence and claim, supported by prior knowledge.29 Individual reasoning reveals how learners justify the connection between claim and evidence.29,30 In an educational context, it is important for educators to explore how learners build arguments by combining given information with prior knowledge to generate explanations, make predictions, and form decisions.31 By following a learner's path of reasoning, it is possible to identify far-reaching misconceptions and to get a chance to address them at an early stage.27,32 The degree of detail of a learner's reasoning allows conclusions about how comprehensive the underlying conceptual understanding is.27 It can also become clear whether learners use key words or catchphrases as part of a larger conceptual understanding or only from a limited perspective.27,33 In organic chemistry learning, the abstract and demanding content cannot be mastered in isolation: it must be combined with representational competencies, to comprehend the huge variety of chemical representations, and with spatial information.2,13 All these components must be considered in sound reasoning. Hegarty and Waller regard spatial problem solving in science not as a process based on a single kind of problem solving but as a hybrid reasoning process.18 This assumption is in line with study results showing that individuals often combine imagistic spatial reasoning with alternative problem-solving strategies.18 Which kind of problem-solving strategy a learner uses can be traced in an interview in which the learner solves tasks using the think-aloud method. In doing so, educators can follow the explanatory path to the solution and recognize which arguments have been incorporated into the solution, and to what extent. For example, if a learner verbalizes spatial orientation or mental rotation, they have included these spatial aspects in their reasoning (among other things),34 for instance to predict the conformation of a product after a chemical reaction. Not surprisingly, numerous authors state that learners' success in organic chemistry is often linked with their ability to build arguments and to reason about reaction mechanisms.35–37 As already mentioned, sophisticated spatial ability is needed to comprehend organic chemistry concepts and to reason about them. Therefore, topic-specific spatial abilities should be fostered during learning. Modern digital tools lend themselves to this purpose, as described in the following section.

2.2  Augmented Reality as an Instructional Aid in Organic Chemistry

Educators should pay attention to these difficulties and challenges when teaching organic chemistry. To support learning, the use of animations and simulations has already been investigated. Used appropriately, they have proven to help students gain a deeper understanding of concepts and spatial relations when compared to traditional analogue representations.8,15,38,39 Besides dynamic content like animations, static virtual content can also support chemistry learning. Midak and colleagues stated that using three-dimensional (3D) images of chemical molecules while teaching organic chemistry gives learners a better understanding of a molecule's geometry and structure and supports the prediction of the products of chemical reactions.9 However well designed, virtual 3D content alone cannot convey a chemical concept and must be combined with other instructional materials. It is conceivable to offer virtual 3D content on a separate device alongside text- and picture-based printed learning materials. In this case, learners would need to switch their attention constantly back and forth between the instructional material and the device, which would violate the spatial contiguity principle of Mayer's Cognitive Theory of Multimedia Learning.11 This so-called split-attention effect would burden the valuable capacity of working memory and thereby impede learning.40,41 A more desirable solution would be to integrate the virtual 3D content seamlessly within the traditional learning setting. One possible way to achieve this integration is by utilizing Augmented Reality (AR). According to Azuma, an AR tool is characterized by three factors.42 First, 3D virtual objects are presented on a screen. Second, the user can interact with these virtual objects. Third, these virtual elements are embedded within the real world.
According to Milgram's Virtuality Continuum, the surrounding world should not be completely blocked out, as is the case with virtual reality tools.43 There are several ways to implement AR technically, but the most common is so-called marker-based AR. A marker is the trigger point that starts the AR: if a device's camera detects a pre-defined marker, it triggers pre-defined content or an action. In educational settings, two-dimensional (2D) printed images in books or other instructional materials usually serve as markers. With regard to Azuma's AR requirements and Mayer's spatial contiguity principle, this brings the advantage that the virtual 3D content appears directly "above" the related 2D representation. An example can be seen in the Sample and Design section. The progressing digitization that crosses all areas of life comes with widespread and powerful mobile devices. Any of these extensively equipped devices, such as smartphones or tablets, can serve as a technical platform to operate an AR tool.41,44 The advantages of cost effectiveness, independent operability, and easy and quick operation are convincing, especially in educational contexts.42,44–46 Several studies have investigated the benefits and potential of AR-based learning. The majority of the reported AR tools are designed to be used at the introductory stage of educational settings.10,40 These works share the common aim of helping learners acquire new knowledge and generalize specific concepts.10,14,47 As AR makes it possible to present virtual 3D objects on screen, learners can perceive phenomena that would usually be invisible or inaccessible.5 The presentation of animations, e.g. to simulate a process, is well received by learners and educators, because dynamic interactions of several objects can be perceived and comprehended easily.8,15,38,39 Offering static virtual objects can also be a helpful contribution to support learning. Through finger movements on the screen, the user can interact with these objects and manipulate them, for example scaling or rotating them freely in space.45 The effects of AR support on learning have been evaluated for different domains.
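The marker-based trigger logic described above amounts to a lookup from detected marker IDs to pre-defined virtual content. The following is only a conceptual sketch of that idea (all names, IDs, and asset files are hypothetical; the chapter's ARC app is a real iPad application whose implementation is not shown here):

```python
# Conceptual sketch of marker-based AR dispatch (hypothetical names/assets).
# A detected marker ID selects the pre-defined 3D content anchored to it,
# so the virtual object is rendered "above" the printed 2D image.

# Marker registry: 2D printed images in the learning material, keyed by ID.
CONTENT = {
    7:  {"kind": "model", "asset": "molecule_a.obj"},          # rotatable 3D model
    12: {"kind": "animation", "asset": "reaction_b.anim"},     # 3D animation
}

def on_frame(detected_marker_ids):
    """Called once per camera frame with the IDs of all markers in view.

    Returns the content to render next to each recognized marker; unknown
    markers are simply ignored.
    """
    return [CONTENT[m] for m in detected_marker_ids if m in CONTENT]

print(on_frame([12, 99]))  # marker 99 is not registered and is ignored
```

In a real AR framework, the per-frame marker detection and 3D rendering would be supplied by the platform; only the marker-to-content mapping is the author's design decision.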
Studies report that AR-supported learners achieved greater learning success than learners in equivalent traditional learning settings.14,40,41,48 These effects are especially pronounced for low-performing students compared to high-performing students.14 Besides increased learning gains, affective variables also seem to be influenced by AR use: learners reported higher learning motivation and a more positive attitude towards the learning subject.5,40,49 Furthermore, Ajit and colleagues assumed that the increased interest might result from the motivating interaction options of the AR tool.49 With regard to organic chemistry learning and the exploration of its abstract concepts, researchers have also found that AR-based learning improved learners' mental rotation abilities and their awareness of spatial relations.41,50–52 Sophisticated mental rotation abilities facilitate organic chemistry learning, because such learners do not have to invest large amounts of their limited cognitive capacity in mental rotation operations.46,53 Finally, the integration of virtual 3D elements into the learning setting, which aids spatial abilities, supports a deeper understanding of abstract concepts and the memorization of procedural knowledge.14,49


2.3  Aim of the Study

The findings of educational AR research seem very promising for supporting organic chemistry learning. However, an open question remains as to whether AR-supported learning of organic chemistry topics supports learners in their spatial understanding of those topics. Fostering spatial abilities could also be achieved using chemical model kits.55 However, these have the disadvantage that the models have to be assembled for every single example and only offer static objects. In contrast, AR tools might save valuable learning time, as the content necessary for a specific task appears without any time lag. Furthermore, learners can also perceive dynamic aspects using AR, which is very important in organic chemistry for conveying dynamic processes like chemical reactions. A learner with a detailed spatial understanding of chemical aspects should be able to reason in a more spatially targeted and detailed way when solving specific problems, compared to learners without additional support. Therefore, we address the following research question:

To what extent do learners use spatial information to complete organic chemistry tasks when they have learned with AR support, in contrast to a comparison group?

2.4  Sample and Design

To answer the research question, we conducted an experiment with 22 second-semester chemistry students of the chemistry teacher-training program at a German university. The students' mean age was M = 21.36 (SD = 2.75), and the gender distribution was 11 male and 11 female students. To evaluate the effects of AR-supported learning in contrast to traditional learning, we split the students into two groups with equivalent gender proportions: an AR group and a non-AR group. First, the students worked on 12 items of the Purdue Spatial Visualization Test: Rotations (PSVT:R) to measure their mental rotation abilities.54 Afterwards, the working phase started, in which the students worked through learning material. The material consisted of text and pictures and covered two subtopics of organic chemistry: stereochemistry and pericyclic reactions. These subtopics were chosen with the research question in mind, because both demand sophisticated mental rotation abilities, and pericyclic reactions additionally involve abstract concepts and spatially complex reactions that the students usually had not encountered before. In addition to the learning material, the learners of the AR group were invited to use our AR app, called Augmented Reality Chemistry (ARC),53 while the learners of the non-AR group did not receive any supplemental materials such as the ARC app or a model kit. The app was provided on an Apple iPad® (6th generation). By pointing the camera at a printed marker in the learning material, a virtual object appeared on screen. The app contained 12 rotatable virtual 3D objects and 17 3D animations. Figure 2.1 illustrates the ARC app in use, combined with the paper-based learning material.

Figure 2.1  Example of the ARC app in use (rotatable 3D model).

During the working phase there were no time constraints, so each learner could work through the learning material at his or her preferred speed, and the AR group learners could use the virtual content as often as they liked. Directly after the intervention, we conducted semi-structured interviews with all learners. After a few demographic questions, four tasks followed that the students were to solve while thinking aloud. These tasks drew on the topics covered in the preceding working phase, and all knowledge necessary for successful task solving was included in the learning material.

● In task one, a model of a butane molecule was presented in the so-called wedge-dash notation. Students were tasked with rotating the molecule by 180° around its bonding axis and then choosing the correct molecule from four offered Newman projections.
● Task two was similar: students were tasked with rotating a wedge-dash notation by 120° around its bonding axis. This time they were asked to draw the correct resulting molecule as a Newman projection.
● In task three, two tetrahedral ball-and-stick models with four differently colored balls were presented. With regard to the chemical concepts of enantiomers and diastereomers, the students were asked whether these two models were identical or not and to explain their solution path. This task is shown in Figure 2.2.
● Task four presented two educts for a pericyclic reaction. The learners were asked to determine the product conformation based on rotation operations of the educts and to select the correct one out of four options.
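The rotations asked for in tasks one and two can be made concrete by tracking the substituents of a Newman projection as dihedral angles around the bond axis; rotating the front atom then just shifts every angle on a 360° circle. The sketch below is our illustration, not part of the study's materials, and the function name and angle convention are assumptions:

```python
# Minimal sketch (not part of the study's materials): substituent positions
# of the front atom in a Newman projection, tracked as dihedral angles and
# rotated about the central C2-C3 bond axis of butane.

def rotate_front(substituents, angle):
    """Rotate the front-atom substituents of a Newman projection.

    substituents: dict mapping substituent name -> dihedral angle in degrees
                  (0° = top position, measured when sighting along the bond).
    angle: rotation in degrees; positions wrap around the full 360° circle,
           mirroring the 'awareness of a 360° circle' coded in the interviews.
    """
    return {name: (pos + angle) % 360 for name, pos in substituents.items()}

# Butane viewed along C2-C3: the front methyl group starts at the top.
front = {"CH3": 0, "H_a": 120, "H_b": 240}

print(rotate_front(front, 180))  # task 1: CH3 ends up at the 180° position
print(rotate_front(front, 120))  # task 2: CH3 ends up at the 120° position
```

A learner's correct answer corresponds to redrawing the front substituents at these shifted angles while the back atom stays fixed.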


Figure 2.2  Task 3: Translation between two ball-and-stick models. Translated from German.

When working on the four tasks, no supplemental materials (e.g., the learning material or the ARC app) were permitted. Through dedicated enquiries, the interviewees were encouraged to explain as much as possible of their solution path and of the spatial aspects involved. In addition to the demographic questions and the four tasks, we wanted to capture the AR group's experience with the ARC app. Those learners were asked about their general impression, whether the app was helpful or not, and whether they could suggest any improvements. On average the interviews lasted 27:07 minutes (minimum 16:07 minutes, maximum 56:09 minutes). Each interview was conducted individually by the same interviewer. The interviews were later transcribed in Microsoft® Word and analyzed in a qualitative content analysis using the software MaxQDA. Categories were developed inductively, with the particular aim of capturing spatial information and explanations in the students' statements, such as rotation angles, positions of atoms, or different molecule planes.

2.5  Results

First, the quantitative results on mental rotation abilities will be presented. In the PSVT:R, learners of the non-AR group reached a mean score of M = 61.36 (SD = 24.23) and learners of the AR group a mean score of M = 65.91 (SD = 15.57). A univariate analysis of variance identified the group difference in mental rotation abilities as not significant, F(1,22) = 0.274, p = 0.606. The evaluation of the semi-structured interviews follows. Since the tasks vary in content and conceptual formulation, they are presented separately. For each task, only selected categories from the qualitative content analysis are reported, chosen with particular regard to the research question.
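For two groups, the reported univariate analysis of variance reduces to a one-way ANOVA comparing between-group to within-group variance. As the study's raw PSVT:R scores are not published, the sketch below uses invented scores purely to show how such an F statistic is computed:

```python
# Illustrative only: the scores below are invented, not the study's data.
# One-way ANOVA F statistic for two groups, computed by hand:
# F = (between-group variance) / (within-group variance).

def f_oneway_two_groups(a, b):
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    grand = (sum(a) + sum(b)) / (na + nb)
    # Sum of squares between groups (df = 2 groups - 1 = 1).
    ss_between = na * (ma - grand) ** 2 + nb * (mb - grand) ** 2
    # Sum of squares within groups (df = na + nb - 2).
    ss_within = sum((x - ma) ** 2 for x in a) + sum((x - mb) ** 2 for x in b)
    return (ss_between / 1) / (ss_within / (na + nb - 2))

non_ar = [40, 55, 60, 70, 75, 58]   # hypothetical PSVT:R percentage scores
ar = [50, 62, 68, 72, 80, 64]
print(round(f_oneway_two_groups(non_ar, ar), 3))
```

A small F (relative to the F distribution with the corresponding degrees of freedom) indicates, as reported above, that the groups did not differ meaningfully in mental rotation ability before the intervention.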


Figure 2.3  Results of task 1, frequencies of utterances separated between the groups.

2.5.1  Task 1—Translation Between a Dash-wedge Notation and a Newman Projection

In task one, learners were asked to rotate a model of a butane molecule in wedge-dash notation by 180° around its bonding axis and then select the correct molecule in Newman projection. Figure 2.3 shows the frequencies of the categories for both groups separately. For this task, AR learners' utterances appear more often in spatially related categories than those of the non-AR group. They tended to focus more often on the part of the molecule to be rotated as well as on the atom positions of the product, also mentioning the different molecule planes. Furthermore, the AR group learners uttered the correct torsion angle, and its relation to a 360° circle, more often. Accordingly, they also performed rotation operations correctly more often than the non-AR group. The AR learners' final selections more often matched the rationales they had uttered before, compared to the non-AR learners. Three learners of the non-AR group selected answers that did not fit their rationales, and one learner of each group did not offer any rationale. Two non-AR learners did not utter any spatial aspects at all.

2.5.2  Task 2—Generating a Newman Projection from a Given Dash-wedge Notation

Task two was similar to task one. A model of a butane molecule was given in wedge-dash notation, and students were asked to rotate it by 120° around its bonding axis and to draw the correctly rotated molecule as a Newman projection.

Figure 2.4  Results of Task 2, frequencies of utterances separated between the groups.

Figure 2.4 represents the categories and their frequencies of appearance. The number of AR learners offering incorrect explanations of the static or the rotated plane, as well as incorrect drawings, is noticeably lower than the number of non-AR learners. For this task, learners who had used AR support were more often able to give a correct explanation and draw a correct sketch. As in task one, the AR learners seemed to focus on spatial aspects during task solution, e.g., by mentioning the correct part to be rotated and the torsion angle, with an awareness of a 360° circle, more often than learners in the non-AR group did. In contrast, mistakes such as reversing the molecule planes and incorrect drawings appeared more often in the non-AR group, and only learners of this group appear in the category No spatial utterances at all.

2.5.3  Task 3—Translating Between Two Ball-and-stick Models

The subject of task three was a tetrahedral ball-and-stick model consisting of four differently colored balls around one central ball. An identical ball-and-stick model, rotated around two axes, was printed next to it. In relation to the chemical concepts of enantiomers and diastereomers, students were asked to verify whether both models were identical or not. The typical way to prove equivalence would be to rotate one model and check whether it can be superimposed completely on the other. Exemplary student quotes can be found in Table 2.1.

Table 2.1  Two exemplary interview excerpts of Task 3. Translated from German.

ID: AT17EMJO (non-AR group)
Student: "Here you can see that the models are mirrored… so if you compare both… so this is my first assumption at a glance. I look at the left model at first. And now I would compare all possible opportunities in my mind, for example horizontally or vertically rotated. Then I check if it could be the same [Short waiting]. One moment please. I need to consider all the opportunities of mirroring in my mind and whether these make sense or not [Short waiting]. Yes, it is definitely mirrored. But the yellow ball is swapped with the red one… no, the red ball is swapped with the green ball."
Interviewer: "You mean the balls are swapped after mirroring?"
Student: "Yes. That means both models are definitely not identical if the mirror principle is applied. But, if one would say that one model is mirrored first and maybe rotated in a second step, then they could possibly be identical. But, if you only rely on mirroring the model, then the models are not identical."
Interviewer: "You just talked about the option first to mirror and to rotate afterwards. Could you please explain this idea in detail?"
Student: "Well, I mean you can mirror the left model in a first step and rotate it afterwards, the result will be the same as if you only mirror it. So, rotating the model does not change anything. Therefore, these models are not identical."

ID: AN21IRKU (AR group)
Student: "My first impression is that this is not mirrored. Therefore, … [Short waiting]. If I revolve it around itself by 180°, then the yellow ball is on the opposite side, as it should be."
Interviewer: "Just to be able to follow you: You have chosen to rotate the left model?"
Student: "No, the right one. I keep the left model stationary. I try to rotate the model on the right around 180°. The yellow ball is on the left side now, as is also the case on the stationary model. The red ball comes to the front plane where previously the blue ball was located. The blue ball itself appears now in the rear plane and the green ball remains on the top. You got it?"
Interviewer: "Yes, I did."
Student: "This was the first step. And now if I rotate it to the front… [Short waiting]… then it would look like it fits."
Interviewer: "Could you please explain this second step for me in detail?"
Student: "Then the red ball would come to the top, the blue ball would point out to the front plane and the green ball would hide in the rear plane behind the central atom. The yellow ball would remain on the left side because this is the axis of the rotation. So, these are identical."

The learner with the ID AT17EMJO is a male student of the non-AR group. Already at an early stage, he determines that the models are mirrored. In subsequent sentences, he mentions horizontal or vertical rotations and suggests that particular balls are swapped. He explains that one model could possibly be mirrored first and rotated in a second step. Upon the interviewer's request, the learner argues that a rotation step after mirroring a model does not change anything concerning the equality of both models. It stands out in his argumentation that he did not perform any rotation of the ball-and-stick models; this utterance therefore falls into the category "No rotation performed". His justification that both models are not identical rests on the incorrect strategy of mirroring the models. Because of this, and because he contradicts himself regarding the strategies, his utterances also fit the categories "Incorrect solution strategy performed" and "Final selection incorrectly justified".

A second example comes from a male student of the AR group with the ID AN21IRKU. At the very beginning, he rules out that the models are mirrored. He separates his argumentation into two separate rotation operations, explaining in detail how he rotates the ball-and-stick model on the right side and at which position each ball is located after each rotation step. He also states that he rotates only one of the two models, keeping the other stationary, and that he uses a specific axis as a fixed point to conduct the rotation. His utterances therefore fit the category "Fixed point or rotation axis mentioned" of the qualitative content analysis. Furthermore, he performed both the first and the second rotation step correctly. Finally, his whole argumentation falls into the category "Final selection correctly justified".

After these exemplary student quotes, the analyses of all participants shall be considered (Figure 2.5).

Figure 2.5  Results of Task 3, frequencies of utterances separated between the groups.

It is remarkable that seven learners of the non-AR group, as well as two learners of the AR group, did not choose rotation operations as their approach. A comparable distribution of learners applied other solution strategies, resulting in incorrect findings. Spatial problem solving is evident in selecting a specific ball or a specific axis as a fixed point before conducting a rotation operation, which was uttered by more AR learners. The AR learners also performed the rotation operation correctly more often and, based on it, were finally able to answer the task correctly more often than the non-AR learners. It is conspicuous that the rotation steps of five AR learners were not traceable.
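The logic behind task three can be stated precisely: a tetrahedron has 12 proper rotations, which act on its four vertices exactly as the even permutations (the group A4). Two labelings of the vertices with four distinct colors are therefore superimposable by rotation if and only if one is an even permutation of the other; an odd permutation yields the mirror image (enantiomer). The following sketch is our illustration of this criterion, not part of the study's analysis:

```python
# Sketch (our illustration): two tetrahedral ball-and-stick models are
# superimposable exactly when one vertex labeling is an even permutation of
# the other -- the 12 proper rotations of a tetrahedron act as the even
# permutations (group A4) of its four vertices.

from itertools import permutations

def parity(perm):
    """Parity of a permutation of 0..n-1: 0 = even, 1 = odd."""
    inversions = sum(1 for i in range(len(perm))
                     for j in range(i + 1, len(perm)) if perm[i] > perm[j])
    return inversions % 2

def superimposable(model_a, model_b):
    """model_a, model_b: tuples of four distinct colors, one per vertex."""
    return any(parity(p) == 0 and tuple(model_a[i] for i in p) == tuple(model_b)
               for p in permutations(range(4)))

a = ("yellow", "red", "blue", "green")
rotated = ("yellow", "blue", "green", "red")   # even permutation: a rotation
mirrored = ("yellow", "blue", "red", "green")  # odd permutation: enantiomer

print(superimposable(a, rotated))   # True
print(superimposable(a, mirrored))  # False
```

The non-AR excerpt above, which relies on mirroring alone, corresponds to testing the odd permutations, which can never establish identity; the AR excerpt's stepwise rotations correspond to searching the even ones.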

2.5.4  Task 4—Determine the Product Conformation

The previous tasks addressed the subtopic of stereochemistry; task four examines the learners' abilities in the subtopic of pericyclic reactions. The task shows a diene as well as a dienophile in trans conformation, and learners had to select the correct resulting product out of four given options. During their explanations, more AR learners than non-AR learners were able to reasonably exclude incorrect answer options (Figure 2.6). Non-AR group learners more often described the direction in which the dienophile approaches the diene, and they also mentioned different molecule planes more often. To solve the task, three learners of each group predicted the correct option by transferring the trans conformation of the educt to the product. Another approach was to rotate the diene and the dienophile, which was done by one non-AR learner and two AR learners.

Figure 2.6  Results of Task 4, frequencies of utterances separated between the groups.

Finally, four non-AR group learners and five AR group learners chose the correct result based on previously correct reasoning, while six learners of each group reasoned incorrectly and selected incorrect options. Spatial utterances were missing in the explanations of three non-AR and two AR learners.

2.6  Discussion

The results will now be discussed and interpreted. As derived from the introduction, the focus of this contribution is to investigate whether AR-supported learning can help learners include spatial aspects in their reasoning during tasks from the field of organic chemistry. Considering the presented results, it stands out that, independent of the task, learners of the AR group were more often able to solve tasks correctly based on previously uttered reasoning than the non-AR group. As already stated in previous research on AR use,41,50–52 AR learners were more often able to conduct rotation operations correctly than learners without AR support. These two findings indicate that AR use supported the AR learners in their spatial understanding of the chemical concepts and tasks. This impression is reinforced by the frequency of spatial utterances, remembering that spatial abilities, as part of conceptual understanding, can only be used in reasoning when they are available. It is noticeable that learners of the AR group uttered spatial characteristics, such as awareness of torsion degrees or molecule planes, at least as frequently as non-AR group learners, and often more frequently. For example, in tasks one and two they described the rotation operations in detail, unprompted, more often than students who learned without AR support. It is remarkable in this context that this impression is reversed for task three: here, five AR learners fall into the category Rotation steps not traceable, while this applies to only one learner of the non-AR group. Nevertheless, as in the previous tasks, AR group learners were able to rotate the molecule correctly more often and, in the end, reasoned correctly more often than the non-AR group learners.
An explanation for this finding could be that, although they were asked to explain aloud, the rotation steps had already been internalized by the time learners solved what was by then the third task involving a rotation operation. Consequently, they were faster and needed less cognitive capacity, rotating mentally without uttering verbal explanations.46,53 In this context, it must be pointed out that the students of the two groups did not differ in mental rotation abilities before the intervention. It is therefore unlikely that the identified findings result from differences in students' preconditions regarding mental rotation; they might instead result from AR use.41,50–52 The impression of spatial understanding fostered through AR use is underlined by another indication. Looking across all tasks, it stands out that AR learners seem able to make use of spatially related solution strategies more often than the non-AR group.22,27,29 Such strategies include focusing only on the part of the molecule to be rotated while ignoring the other part, or focusing more on the product than on the educt. Another interesting example of applying spatially related strategies comes from task three, where eight out of eleven AR learners mentioned first selecting one ball or an axis of the molecule as a fixed point and then conducting the rotation operations. Equivalent utterances by non-AR learners were far rarer. This indicates that learners who can define this kind of self-selected assistance for themselves have a more developed spatial understanding of rotation operations than learners who do not use such strategies.22,23,27–29 One notable limitation of the experiment is that the formation of the categories, as well as the rating of the utterances, was conducted by only one person; a second rating could not be realized. Furthermore, it must be kept in mind that only statements that learners actually uttered could be analyzed. If learners withheld aspects, even though they were asked to explain in detail, these pieces of knowledge could obviously not be captured. It would be interesting to extend the experiment to other subtopics and tasks within organic chemistry. In addition, it would be interesting to repeat the test of mental rotation abilities after a certain time to gain follow-up insights into how supportive AR use is for chemistry learners. From this study, implications for organic chemistry teaching in universities as well as in high schools can be derived. As teachers and educators face the heterogeneous spatial abilities of their students, AR can be a useful addition to the teachers' media toolbox, providing multiple approaches to learning organic chemistry. However, although AR is quite easy to use, it is advisable for educators to prepare some exemplary markers to familiarize all learners with the technology at the very beginning of the lecture. Several analogue spatial ability trainings are also available to address the mentioned difficulties in learning organic chemistry. Contrasting these analogue trainings with AR support reveals new research approaches.
It would be interesting to analyze whether one training effect might improve learners' spatial abilities for longer than another. As learning time is a very valuable asset in education, the question also arises whether AR-supported learning might be more time efficient, with equal success, when compared to analogue training.

References

1. J. Lemke, in Reading Science: Critical and Functional Perspectives on Discourses of Science, ed. J. R. Martin and R. Vell, Routledge, New York, NY, USA, 1998, pp. 87–113.
2. H. K. Wu and P. Shah, Sci. Educ., 2004, 88, 465.
3. M. Harle and M. A. Towns, J. Chem. Educ., 2011, 88, 351.
4. M. A. Rau, Educ. Psychol. Rev., 2017, 29, 717.
5. B. L. Nielsen, H. Brandt and H. Swensen, Nord. Stud. Sci. Educ., 2016, 12, 157.
6. K. Altmeyer, S. Kapp, M. Thees, S. Malone, J. Kuhn and R. Brünken, Br. J. Educ. Technol., 2020, 51, 611.


7. J. C. Castro-Alonso, P. Ayres and J. Sweller, in Visuospatial Processing for Education in Health and Natural Sciences, ed. J. C. Castro-Alonso, Springer, Cham, Switzerland, 2019, pp. 111–143.
8. S. B. Zahra, Effect of Visual 3D Animation in Education, Department of Computer Science, Lahore Garrison University, 2016.
9. L. Y. Midak, I. V. Kravets, O. V. Kuzyshyn, L. V. Baziuk and K. V. Buzhdyhan, J. Phys. Conf. Ser., 2021, 012013.
10. P. P. Nechypurenko, T. V. Starova, T. V. Selivanova, A. O. Tomilina and A. D. Uchitel, Proceedings of the 1st International Workshop on Augmented Reality in Education, 2018.
11. R. E. Mayer, The Cambridge Handbook of Multimedia Learning, Cambridge University Press, Cambridge, UK, 2014.
12. T. Dickmann, M. Opfermann, E. Dammann, M. Lang and S. Rumann, Chem. Educ. Res. Pract., 2019, 20, 804.
13. H. Tuckey, J. Selvaratnam and J. Bradley, J. Chem. Educ., 1991, 68, 460.
14. S. Cai, X. Wang and F. K. Chiang, Comput. Hum. Behav., 2014, 37, 31.
15. M. Abdinejad, B. Talaie, H. Qorbani and S. Dalili, J. Sci. Educ. Technol., 2021, 30, 1.
16. C. S. Carter, M. A. LaRussa and G. M. Bodner, J. Res. Sci. Teach., 1987, 24, 645.
17. J. R. Pribyl and G. M. Bodner, J. Res. Sci. Teach., 1987, 24, 229.
18. M. Hegarty and D. A. Waller, Intelligence, 2004, 32, 175.
19. J. L. Mohler, in Proceedings of the Eurographics, Eurographics, Lausanne, Switzerland, 2006, pp. 79–86.
20. M. Stieff, Learn. Instr., 2007, 17, 219.
21. M. Terlecki, N. Newcombe and M. Little, Appl. Cogn. Psychol., 2008, 22, 996.
22. T. Lombrozo, Trends Cogn. Sci., 2006, 10, 464.
23. N. Graulich and M. Schween, J. Chem. Educ., 2018, 95, 376.
24. N. Graulich, Chem. Educ. Res. Pract., 2015, 16, 9.
25. N. Graulich and I. Caspari, Chem. Teach. Int. Best Pract. Chem. Educ., 2020, 3, 19.
26. N. P. Grove, M. M. Cooper and E. L. Cox, J. Chem. Educ., 2012, 89, 850.
27. K. Bain, J. M. Rodriguez and M. H. Towns, J. Chem. Educ., 2019, 96, 2086.
28. L. Lieber and N. Graulich, Chem. Educ. Res. Pract., 2021, 23, 38.
29. S. E. Toulmin, The Uses of Argument, Updated Version, Cambridge University Press, Cambridge, 2003.
30. K. L. McNeill and J. Krajcik, Book Study Facilitator's Guide: Supporting Grade 5–8 Students in Constructing Explanations in Science: The Claim, Evidence and Reasoning Framework for Talk and Writing, Pearson Allyn & Bacon, New York, 2012.
31. H. Sevian and V. Talanquer, Chem. Educ. Res. Pract., 2014, 15, 10.
32. L. Cooper, in Cognition and Representation, ed. S. Schiffer and S. Steele, Westview Press, Boulder, CO, 1988, pp. 53–86.
33. R. Ferguson and G. M. Bodner, Chem. Educ. Res. Pract., 2008, 9, 102.

Supporting Spatial Thinking in Organic Chemistry Through Augmented Reality

35

34. S. F. Hornbuckle, L. Gobin and S. N. Thurman, Issues Educ. Res., 2013, 7, 45. 35. N. E. Bodé, J. M. Deng and A. B. Flynn, J. Chem. Educ., 2019, 96, 1068. 36. D. Cruz-Ramirez de Arellano and M. H. Towns, Chem. Educ. Res. Pract., 2014, 15, 501. 37. J. M. Deng and A. B. Flynn, Chem. Educ. Res. Pract., 2021, 22, 749. 38. D. A. Falvo, Int. J. Technol. Teach. Learn., 2008, 4, 68. 39. M. Jancheski, 8th Conference on Informatics and Information Technology with International Participation. 2011, pp. 177–181. 40. J. Bacca, S. Baldiris, R. Fabregat and S. Graf, Educ. Technol. Soc., 2014, 17, 133. 41. M. Ibáñez and C. Delgado-Kloos, Comput. Educ., 2018, 123, 109. 42. R. T. Azuma, Teleop. Virt. Environ., 1997, 6, 355. 43. P. Milgram and F. Kishino, Trans. Inf. Syst., 1994, 12, 1321. 44. T. Suselo, B. Wünsche and A. Luxton-Reilly, Proceedings of the 52nd ACM Technical Symposium on Computer Science Education, 2021, p. 872. 45. M. Núñez, R. Quirós, I. Núñez, J. B. Carda and E. Camahort, Proceedings of the EE’08, 5th WSEAS/IASME International Conference on Engineering Education, Heraklion, Greece, 2008, pp. 271–277. 46. S. Sheikh, M. Belal, M. Heyat, O. Alshorman and M. Masadeh, Innovation and New Trends in Engineering, Science and Technology Education Conference, 2021, pp. 1–6. 47. K. R. Bujak, I. Radu, R. Catrambone, B. Macintyre, R. Zheng and G. Golubski, Comput. Educ., 2013, 68, 536. 48. S. Cai, E. Liu, Y. Yang and J. C. Liang, Br. J. Educ. Technol., 2019, 50, 248. 49. G. Ajit, T. Lucas and L. Kanyan, Appl. Econ., 2021, 39, 1. 50. E. Woods, M. Billinghurst, J. Looser, G. Aldridge, D. Brown, B. Garrie, and C. Nelles, Proceedings of the 2nd International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia, Association for Computing Machinery, New York, 2004, pp. 230–236. 51. C. Carbonell-Carrera and J. L. Saorin, Eurasia J. Math. Sci. Technol. Educ., 2018, 14, 709. 52. J. Kotlarek, I. C. Lin and K. L. Ma, Proc. Symp. Spat. 
User Interact., 2018, 79–88. 53. S. Keller, S. Rumann and S. Habig, Information., 2021, 12, 96. 54. G. M. Bodner and R. B. Guay, J. Chem. Educ., 1997, 2, 1. 55. M. Stieff, S. Scopelitis, M. E. Lira and D. Desutter, Sci. Educ., 2016, 100, 344.

Chapter 3

Representational Competence Under the Magnifying Glass—The Interplay Between Student Reasoning Skills, Conceptual Understanding, and the Nature of Representations†

Lyniesha W. Ward, Fridah Rotich, Julia Hoang and Maia Popova*

Department of Chemistry and Biochemistry, University of North Carolina at Greensboro, Greensboro, North Carolina, USA *E-mail: [email protected]

3.1 Introduction

3.1.1 The Role of Representational Competence in Organic Chemistry

Chemists routinely use a wide variety of representations (e.g., two-dimensional (2D) graphs and diagrams, concrete ball-and-stick models, symbolic equations, etc.) to communicate about the invisible. While experts are adept at

† Electronic supplementary information (ESI) available: see Figures S3.1–S3.4. See DOI: 10.1039/9781839167782

  Advances in Chemistry Education Series No. 10 Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices Edited by Nicole Graulich and Ginger Shultz © The Royal Society of Chemistry 2023 Published by the Royal Society of Chemistry, www.rsc.org


using representations to understand and communicate chemical phenomena, research suggests that novices experience difficulties while learning with representations.1 To extract meaningful information, learners need to understand both the format of each representation, including its subtleties and the conventions for interpreting it, and the abstract chemical concepts that are encoded in the representation.2 Therefore, learners must develop representational competence (RC)—the ability to utilize representations to think about, communicate, and create meaning for a phenomenon.1 Kozma and Russell synthesized findings from studies investigating differences in expert–novice practices1,3 and proposed a set of concrete skills that comprise RC. This chapter uses the RC framework as a guide to investigate how students reason with dash-wedge diagrams (DWDs) and Newman projections (NPs).

Although NPs are often taught in tandem with DWDs, students' difficulties with understanding these representations can stem from the fact that the conventions associated with each representation are quite different. DWDs depict three-dimensional information by utilizing dashes and wedges—bonds going into the plane of the page and bonds coming out of the plane of the page, respectively. NPs provide an alternative view of a molecule, focusing on the spatial relationships between substituents on adjacent carbon atoms. NPs highlight that rotation about a carbon–carbon sigma bond yields staggered and eclipsed conformers of varying stability.

Previous research on student understanding of DWDs and NPs focused primarily on student spatial ability—the ability to generate, manipulate, and analyze the mental images formed when interpreting spatial information embedded in representations.4 While several studies investigated student spatial ability and strategy use (e.g., spatial–imagistic vs. algorithmic–diagrammatic) when translating between DWDs and NPs,5–8 none of these studies focused specifically on student RC skills. Thus, previous research provided some insight into student ability to translate between DWDs and NPs, but little is known about other RC skills.

In this study, we analyze student performance on several tasks involving DWDs and NPs that require various RC skills. These target skills include the ability to (a) interpret representations, (b) translate across representations, (c) generate representations, and (d) use representations to make inferences about chemical phenomena. To support deeper analyses of student reasoning while completing these tasks, we employ Schönborn and Anderson's model to illustrate how students reason with the external features of DWDs and NPs and the conceptual information embedded within them.9

3.1.2 The Interplay Between the Nature of Representations, Conceptual Understanding, and Reasoning

Schönborn and Anderson designed an empirical model of factors determining students' ability to interpret external representations.9 The model describes three main factors: mode (M)—external features of a representation; conceptual (C)—students' understanding of the concepts of relevance to


a representation; and reasoning (R)—cognitive processes and skills needed either to perceive and decode external features of a representation or to access and retrieve conceptual information relevant to a representation (see Figure 3.1). The interaction between the three main factors results in four additional factors: reasoning–conceptual (R–C)—the cognitive processes students employ as they access, select, and retrieve conceptual knowledge of relevance to the representation; reasoning–mode (R–M)—the student's ability to perceive and decode explicit features of a representation; conceptual–mode (C–M)—the propositional knowledge communicated through the external features of the representation; and conceptual–reasoning–mode (C–R–M)—the engagement of all three factors, that is, the student's ability to interpret a representation by linking their conceptual knowledge to its external features when decoding the information the representation communicates.9

C, M, and C–M are factors established by the scientific community (i.e., scientifically normative explanations of concepts, interpretations of representational features, or extraction of conceptual information embedded in representations). The R factor cannot occur without something to reason with (in this case, concepts or features of a representation). R–C, R–M, and C–R–M are factors that can be elicited directly from student data (i.e., students' reasoning about concepts, representational features, and conceptual information embedded in representations). Given that this study focused on student reasoning with representations, our analysis primarily captured the R–M and C–R–M factors.

The model has previously been applied in the context of biochemistry with representations of antibody–antigen interactions.9 Our study adapts this

Figure 3.1 C–R–M model of factors determining a student's ability to interpret external representations.9


model to the novel context of organic chemistry representations. Additionally, the model was originally positioned as explaining students' ability to interpret representations; we expand its application to explain how students reason when engaging in tasks that require additional RC skills. Specifically, this study answers the following research questions:

1. How do organic chemistry students reason while interpreting, translating, using, and generating dash-wedge diagrams and Newman projections?
2. How does student reasoning compare with their ability to accurately complete tasks associated with interpreting, translating, using, and generating dash-wedge diagrams and Newman projections?

3.2 Study Design and Methods

We designed a semi-structured think-aloud interview protocol to elicit student reasoning as students completed various tasks targeting DWDs and NPs. These tasks were devised to capture how students (a) interpret DWDs and NPs, (b) translate from DWDs to NPs and from NPs to DWDs, (c) use NPs to make inferences about stability, and (d) generate a NP when given a DWD. A more detailed description of each task, including students' performance on each task, is presented in the findings section.

Semi-structured think-aloud interviews were conducted in person in early Spring 2020 with ten students who had completed Organic Chemistry I in Fall 2019. Think-aloud interviews were ideal for this study as they enabled us to ask probing questions to capture how the participants reasoned while engaging with each representation and completing the tasks.10,11 The interviews lasted approximately one hour. Audio and video recorders were used to document students' verbal explanations and gestures, and a Livescribe Smartpen was used to capture students' writing and drawings.12 Each student received a $20 gift card as compensation for participating in this Institutional Review Board-approved study. Pseudonyms were assigned to each participant to protect their identities.

The interviews were transcribed verbatim, and the transcripts were augmented with students' drawings. The transcripts were coded both inductively and deductively. Inductive coding was used to derive patterns directly from the data: we utilized in vivo and constructed codes to capture and represent students' ideas,13 and we used axial coding and constant-comparative analysis to evaluate each code and build hierarchies of codes and categories.13 Deductive coding was used to code each student's statement as either reasoning related to the mode of a representation (R–M factor) or reasoning about the conceptual information embedded in the mode of a representation (C–R–M factor).
The second author collected the data. The first and third authors coded the data and, depending on the code, obtained 84–100% agreement.14 They discussed each case of disagreement in their coding until 100% negotiated agreement was reached. In addition, all four


authors met weekly to discuss and revise the codes and analyze the data for patterns. Peer debriefing, investigator triangulation, code–recode strategy, negative case analysis, and memo writing were the methodological strategies employed to ensure the credibility, dependability, and confirmability of our findings.15,16 We present our findings as a case study of three representative students to provide insight into how students reason with DWDs and NPs.

3.3 Findings

Our analysis revealed two types of reasoning units: reasoning strategies and rule-based reasoning patterns. We define reasoning strategies as fine-grained approaches used to solve tasks targeting various representations. For example, while translating between DWDs and NPs, the students in our study used previously reported strategies (in the context of reasoning about skeletal structures), such as ‘counting atoms’ or ‘mapping atoms’.17,18 We also observed students using the ‘spatial orientation’ strategy, also known as ‘perspective-taking’, where students use an egocentric frame of reference to view a structure from a different perspective (previously reported in the context of students translating from Fischer projections or dash-wedge diagrams to NPs).6,19 Simultaneously, we captured novel reasoning strategies such as ‘mapping connectivity’, where students evaluate whether the atoms or functional groups in one representation are linked in the same way as in another representation.

Each reasoning strategy can be used appropriately or inappropriately. If a participant reasoned appropriately, they applied the strategy accurately, and their statements were scientifically sound and relevant in the context of the discussed representation. Inappropriate reasoning involves inaccurate strategy usage, scientifically unsound statements, or statements inapplicable in the context of a specific representation.

In contrast, rule-based reasoning patterns are larger-grain rigid ideas about the features of representations. For example, some students in our study held the idea that substituents on the front carbon in a NP must be on a wedge in a DWD. Unlike reasoning strategies, rule-based reasoning patterns are either inappropriate or conditional (appropriate only in certain contexts).

Given that this study focused on student reasoning with representations, our analysis primarily captured the R–M and C–R–M factors.
Specifically, all tasks elicited R–M reasoning, whereas some also elicited C–R–M reasoning (e.g., tasks asking students to interpret and use NPs and DWDs). Thus, like other studies,20 we found that the nature of the tasks had an impact on student reasoning. We begin by presenting an overview of students' performance across the tasks designed to capture students' RC skills with DWDs and NPs, as illustrated by the Sankey diagram (see Figure 3.2). The Sankey diagram shows that the appropriateness of students' reasoning varied with each representation. For example, more students demonstrated appropriate reasoning while interpreting NPs than DWDs. Student reasoning also varied across translations; more students exhibited appropriate reasoning when translating from a NP to a DWD than when translating in the reverse direction.


Figure 3.2 Sankey diagram illustrating how participants reasoned across different tasks involving Newman projections and dash-wedge diagrams.

Overall, the appropriateness of each student's reasoning was somewhat consistent across the representations and tasks. The following analysis is presented as a case study to illuminate how students reason by using contrasting comparisons.21 We selected Dee, Carl, and Crystal as representative cases to demonstrate the variation within student reasoning: Dee represents participants who reasoned appropriately across most of the tasks; Carl represents participants who reasoned appropriately occasionally; Crystal represents participants who primarily exhibited inappropriate reasoning.

3.3.1 Students' Reasoning While Interpreting Dash-wedge Diagrams and Newman Projections

3.3.1.1 Description of the Interpretation Tasks

Participants were presented with a DWD (see online supplemental Figure S3.1) and asked to (a) describe what the representation communicates, (b) interpret each external feature of the representation (e.g., dashes, wedges, lines, symbols, etc.), (c) explain the purpose of the representation, and (d) construct a molecular formula for the given DWD. The latter task enabled us to evaluate whether the students were able to interpret the implicit atoms in DWDs. This process was repeated with a NP. Below we outline how the students reasoned about the external features of these representations (R–M factor) and the conceptual information embedded in the modes of these representations (C–R–M factor).

3.3.1.2 Interpreting Dash-wedge Diagrams

All three participants demonstrated some inappropriate reasoning while interpreting DWDs. Though Dee, Carl, and Crystal were able to describe the spatial relations between the external features (i.e., bonds going into the


plane of the page and bonds coming out of the plane of the page), none of them could identify the correct number of hydrogens in the DWDs. This demonstrates students' difficulties with reasoning about the mode of the representation, specifically the implicit atoms in DWDs (R–M factor). At the same time, all three participants described appropriate conceptual information embedded in DWDs (C–R–M factor). For example, Dee stated that these diagrams are effective for determining the stereochemistry of molecules, Carl stated that DWDs can be used to determine molecular geometry, and Crystal described the geometry and dynamic nature of cyclic DWDs (see Table 3.1).

3.3.1.3 Interpreting Newman Projections

More participants were successful with interpreting the external features of the NPs than of the DWDs (R–M factor). While counting the atoms and describing structural and spatial relations, Dee labelled the carbons in the NP and drew a skeletal structure to support her interpretation (see Table 3.1). This enabled her to count all the atoms in the context of a more familiar representation. Carl was able to reason appropriately about the number of atoms and the structural and spatial relations in NPs without translating the NP into another representation. Crystal still struggled to count the hydrogen atoms, even though the hydrogen atoms in the NP were explicit. However, she was able to recognize the remaining structural and spatial characteristics of the NPs. Additionally, all participants could describe that the purpose of NPs is to view a molecule from another perspective. However, only Carl also stated that NPs communicate information about the dynamic nature of molecules (C–R–M factor) (see Table 3.1).

3.3.2 Students' Reasoning While Translating Between Dash-wedge Diagrams and Newman Projections

3.3.2.1 Description of the Translation Tasks

There were four tasks to translate from a DWD to a NP (see online supplemental Figure S3.2) and three tasks to translate from a NP to a DWD (see online supplemental Figure S3.3). We presented participants with a DWD and asked them to select the corresponding NP(s) from three options. We repeated this process with a NP and asked students to select the corresponding DWD. The participants were informed that there might be more than one corresponding representation for each task. Participants were also encouraged to draw a new representation if they thought that none of the provided choices corresponded to the given structure. To thoroughly capture student reasoning, we probed students with additional questions to elicit their explanations about why their selected choice(s) corresponded as well as why the remaining choice(s) did not.

Table 3.1 Example quotes and drawings from the representative students. Evidence of the R–M factor is shown in plain text, and evidence of the C–R–M factor is in bold. Incorrect statements and inappropriate strategy usage are in italics. Note that students' NPs have been redrawn in ChemDraw for clarity.

What information does this representation communicate? (Interpret cyclic DWD)
Dee: “Wedges and dashes like cis and trans… the wedges mean that it's coming towards you like it's in front of you and then this (dashes) means it's behind the structure in the back.”
Carl: “The bond angles… If you knew the geometry and everything, you could do the bond angles.”
Crystal: “Um, it looks like a ring, but it's not really a flat hexagon… Molecules they don't just come together and are not perfectly flat. They have electron repulsion, withdrawing, and they're always moving…”

What is the purpose of this representation? (Interpret NP)
Dee: “To help with spatial awareness. And to see where bonds, where atoms are in space. And you're supposed to be able to kind of rotate that in your mind and understand the compound.”
Carl: “…the carbon and the one behind… to see how that structure would be in different configurations when you're moving it around from a specific perspective. [It shows] where those specific molecules are located… in terms of their hindrance with each other and what central carbon they're on.”
Crystal: “And you can draw it like (labels carbon in NP and draws a skeletal structure)… it's giving me that like the point of view that we're looking (draws an eye)… and we're looking at it like, you know, straight like… you're looking at carbon 2 right there and it's like the center and then carbon 3 is right behind it…”

Which of the provided choices correspond to the given representation? (Translate DWD to NP)
Dee: draws skeletal structure; maps connectivity; maps functional groups; maps labeled atoms
Carl: see pattern 1 in Table 3.2
Crystal: counts carbon atoms; maps connectivity; maps functional groups; uses spatial orientation

Which of the provided choices correspond to the given representation? (Translate NP to DWD)
Dee: maps connectivity; maps functional groups; maps labeled atoms; uses spatial orientation
Carl: see pattern 2 in Table 3.2
Crystal: see pattern 3 in Table 3.2

Can you generate a NP for the given DWD? (Generate NP from DWD)
Dee: uses a variety of appropriate reasoning strategies to generate the NP
Carl: generates a NP with incorrect stereochemistry
Crystal: generates a NP with missing atoms; see pattern 3 in Table 3.2

Which structure is more stable? (Use NP to make inferences about stability)
Dee: “Um, but if it's eclipsed and they're completely over one another, they're not as stable because, um, there's not enough room for electron repulsion and withdrawing… This one (staggered conformer) is, um, staggered is more [stable] than this one (eclipsed conformer). I can tell that because of the amount of space that's between the lines.”
Carl: “Yeah, the left (staggered conformer) it's the most, it has each, each molecule or atom has the most space by itself. It's not trying to repel each other. Yeah. Yeah, I think that that compound (staggered conformer) looks to be more stable. Just the fact that they don't have to, their charges aren't repelling each other.”
Crystal: “This (staggered conformer) has a little bit more distance than this (eclipsed conformer), the methyls are further away here (staggered conformer). Well, actually no. They are further away here (eclipsed conformer), but I think still since Br is more electronegative than the methyl, then it being closer to the CH and stuff makes it (staggered conformer) less stable.”


Table 3.2 Examples of the rule-based reasoning patterns identified in the data.

Pattern 1 (conditional reasoning): Everything on the top and the bottom of a NP must be in the plane of the main chain of a DWD.
Carl (translate task): “…A is wrong because I think the CH2CH3 is supposed to be up at the top, so it doesn't have that on A. So, I think it's B… because if you look at it from the perspective of carbon three (labeled 3 in representation 3′) you do, you see the CH2CH3 on the top and then the methyl on the bottom, which is carbon five (labeled 5 in representation 3′).”
Carl recognized carbon 3 (label 3 in representation 3′) as the front carbon and expected carbon 2 to point upward and carbon 5 to point downward. He did not consider the possibility of rotamers, which is why he obtained the incorrect answer.

Pattern 2 (inappropriate reasoning): Substituents on the front carbon in a NP are on a wedge in a DWD, while substituents on the back carbon in a NP are on a dash in a DWD.
Carl (translate task): “I want to say CH3 is going to be up here (indicates the CH3 labeled 3 in representation 4′) and that's, that's the one that's going to be towards you. There's CH3 down there (indicates the CH3 labeled 4 in representation 4′) and that's towards you as well… So, I want to say it's C because there's two methyl groups that are facing towards you (methyl groups on the wedge). And the other two (A and C) don't have that.”
Carl held the idea that the methyl groups on the front carbon in representation 4′ are coming forward. He then attempted to map this spatial relation to the wedges in the DWDs, so that the methyl groups still point forward. Carl obtained the correct answer despite using inappropriate reasoning.

Pattern 3 (inappropriate reasoning): Methyl groups only branch off the main chain (they are not on the terminal ends of the main chain).
Crystal (translate task): “This (B) has three methyls and this (C) has three methyls (methyls on dashes and wedges). So, I'm thinking of going to the three groups because I see a lot of methyl groups here (circles the methyls in representation 4′), so I'm thinking it's probably safer going with, um, one of B or C…” (Continues to debate B and C and selects B.)
Crystal ignored the methyl groups on the terminal ends of DWDs when counting methyl groups. For this reason, Crystal obtained the incorrect answer.
Crystal (generate task): “This [carbon 3 in the DWD] is this [circle in NP] and so it [carbon three in the DWD and the circle in the NP] has a CH3 group on it, then you go one back from it and see another CH3 (indicates the methyl pointing upward in the isopropyl group in the DWD). So that's why I have [two methyl groups] there [attached to the circle in the NP]. And then that's what I did.”
Crystal ignored the methyl groups on the terminal ends of the DWD when constructing her NP. Her focus on the substituents branching off the main chain prompted her to incorrectly place two methyl groups on the circle in the NP. As a result, she missed carbon 4 and both terminal methyl groups in the DWD. Crystal obtained the incorrect answer.


3.3.2.2 Translating from Dash-wedge Diagrams to Newman Projections

Dee was able to reason appropriately while translating from a DWD to a NP (see Figure 3.2). She successfully reasoned with the mode of the representations by labelling the carbon atoms in the DWD and mapping those labelled atoms onto the NPs. Dee's reasoning about the mode enabled her to accurately distinguish the representations that corresponded from the ones that did not. Interestingly, Dee's inability to count the implicit hydrogens while interpreting DWDs did not impact her ability to translate between DWDs and NPs.

Carl was unable to reason appropriately while translating between DWDs and NPs. While interpreting NPs, Carl mentioned that NPs can show different rotational conformers, but he did not incorporate this reasoning while engaging with the translation tasks. Carl's reasoning while translating between DWDs and NPs demonstrated a rule-based reasoning pattern that everything on the top and the bottom of a NP must be in the plane of the main chain of a DWD (see pattern 1 in Table 3.2). Though this reasoning can be appropriate, it was insufficient for this problem, which included rotamers (see online supplemental Figure S3.2), and it led Carl to incorrect answers about the translations. Strict adherence to this pattern, instead of applying knowledge about free rotation around the carbon–carbon sigma bond, has been observed in previous studies with students5 and pre-service teachers.22

Crystal also struggled to reason appropriately during these translations. After repeated unsuccessful attempts to identify which carbons in the DWD corresponded to the intersecting lines and the circle in the NP (see online supplemental Figure S3.2), Crystal resorted to counting carbon atoms in the structures to eliminate incorrect options. This strategy was productive, as she narrowed the choices down to the two correct options.
From here on, Crystal inappropriately mapped the connectivity of the atoms and functional groups. Ultimately, Crystal correctly selected one of the two corresponding structures, but aside from counting carbon atoms, she failed to use other reasoning strategies appropriately.

3.3.2.3 Translating from Newman Projections to Dash-wedge Diagrams

More participants reasoned appropriately while translating from NPs to DWDs than in the reverse direction. Dee demonstrated appropriate reasoning by labelling the atoms in each structure, mapping the connectivity of the functional groups in each representation, and using the spatial orientation strategy to determine which structures did not correspond to the given NP. While translating from a NP to a DWD, she used the same reasoning strategies as when translating from a DWD to a NP. This highlights that she was familiar with productive strategies and used them consistently.


In comparison to Dee, Carl focused on the spatial arrangement of atoms and groups by relying on a different rule-based reasoning pattern than the one he used when translating in the reverse direction: substituents on the front carbon in a NP are on a wedge in a DWD, while substituents on the back carbon in a NP are on a dash in a DWD (see pattern 2 in Table 3.2). Though this reasoning is inappropriate, Carl obtained the correct answers because the external features in this task (see online supplemental Figure S3.3) unintentionally fit the pattern above. Previously, while translating from a DWD to a NP, Carl also used inappropriate reasoning, which led him to incorrect answers. There is a clear inconsistency between Carl's reasoning when translating in different directions, as well as a misalignment between the appropriateness of his reasoning and the correctness of his answers.

Like Carl, Crystal reasoned differently with each translation direction. She began by counting methyl groups but used the mapping functional groups strategy inappropriately and inconsistently. She ignored the methyl groups on the terminal ends of the main carbon chain of the DWD, which demonstrates that she used the following rule-based reasoning pattern about the external features of a DWD: methyl groups only branch off the main chain (they are not on the terminal ends of the main chain) (see pattern 3 in Table 3.2). She also misattributed the isopropyl group in the NP (see representation 4′ in Table 3.2) to the tert-butyl group in the DWD (option choice B). As a result, she could not select the correct structure.

3.3.3 Students' Reasoning While Generating a Newman Projection from a Dash-wedge Diagram

3.3.3.1 Description of the Generation Task

This task provided students with a DWD and asked the participants to generate the most stable Newman projection along a specified bond (C2–C3) (see online supplemental Figure S3.4). To successfully complete this task, one needs to utilize multiple RC skills (i.e., translate a DWD to a NP, generate a NP, and use the NP to make inferences about stability). In this section, we discuss how students translated between the representations to generate the NP. In the next section, we unpack how the students used the NP to make inferences about stability.

3.3.3.2 Generating a Newman Projection from a Dash-wedge Diagram

Dee and Carl reasoned similarly during this task. Both attended to the functional groups that were connected to the front and back carbons in the DWD (R–M factor). But unlike Carl, Dee also utilized the spatial orientation strategy to think about the relative orientation of the groups and atoms in space. Even though both Dee and Carl's reasoning was appropriate, only Dee generated the correct NP (see Table 3.1). Carl drew a NP with incorrect stereochemistry.

Representational Competence Under the Magnifying Glass


Beyond recognizing the connectivity of the functional groups, spatial orientation appeared to be an essential reasoning strategy for obtaining the correct answer. Interestingly, both Dee and Carl drew structures with the correct number of atoms, even though they could not identify the correct number of hydrogen atoms when interpreting DWDs in the interpretation task.

Crystal demonstrated inappropriate reasoning while completing this task. Yet again, she was unable to map the connectivity of the functional groups while translating between the representations (R–M factor). While drawing her NP she attended only to the substituents branching off the main carbon chain. Crystal relied on the same rule-based reasoning pattern that she used during the translation task: methyl groups only branch off the main chain (they are not on the terminal ends of the main chain) (see pattern 3 in Table 3.2). As a result, Crystal drew a NP with missing atoms. This corresponded with her inability to count atoms during the interpretation tasks.

3.3.4 Students' Reasoning While Using Newman Projections to Make Inferences About Stability

3.3.4.1 Description of the Use Tasks

Two prompts asked the participants to use NPs to make inferences about stability. First, the participants were asked to make inferences about stability by comparing the two NPs provided in the interpretation task (see online supplemental Figure S3.1). Second, during the generation task, the participants were asked to explain why their constructed NP was the most stable conformer (see online supplemental Figure S3.4). Students were explicitly asked to reference the external features of representations that were relevant to their inferences. Below, we outline how the participants reasoned about the surface features of NPs (R–M factor) and the conceptual information embedded in NPs (C–R–M factor) while making inferences about stability.

3.3.4.2 Using Newman Projections to Make Inferences About Stability

Using the NPs from the interpretation task, Dee, Crystal, and Carl each (a) mapped the functional groups (R–M factor) to recognize that the structures were conformers of the same molecule, (b) selected the staggered conformer as the more stable of the two, and (c) explained how less space between the functional groups can cause steric strain, resulting in less stable conformers (C–R–M factor) (see Table 3.1). However, Dee inappropriately identified electronegativity as the basis for the difference in stability between staggered and eclipsed conformers (C–R–M factor). In comparison, Carl and Crystal appropriately discussed the dynamic nature of the molecules, steric hindrance, and electron repulsion as the causes for the difference in stability (C–R–M factor). In the generation task, though some of the participants did not generate the correct NP, they each constructed a staggered conformer. All three participants said


their generated structure was the most stable because the substituents were arranged as far apart as possible. This task did not elicit additional concepts beyond steric hindrance (C–R–M factor).

3.4 Summary of Findings and Conclusions

3.4.1 Summary of Findings Across the Tasks that Focused on Various Representational Competence Skills

In summary, while interpreting DWDs and NPs, students reasoned less appropriately when interpreting the surface features of these representations (R–M factor) than when discussing the conceptual information embedded in them (C–R–M factor). The most prevalent challenge with interpreting DWDs and NPs was an inability to count hydrogen atoms, with some students struggling to do so even when the hydrogen atoms were explicitly shown (as in the case of NPs). The cognitive elements related to the C–R–M factor also varied with the mode of representation. For example, participants discussed the dynamic nature of molecules with both DWDs and NPs, but they discussed molecular geometry and geometric isomerism in the context of DWDs only.

We found that the difficulties the participants had while interpreting representations (e.g., counting implicit hydrogens) did not impact their ability to translate between representations. We also identified several rule-based reasoning patterns that some participants used to translate between DWDs and NPs. These reasoning patterns reflected rigid, coarse-grained ideas about associations between the external features of DWDs and NPs (R–M factor). Importantly, depending on the external features of the structures, both correct and incorrect answer choices could be supported by the rule-based reasoning patterns. This is further confirmation that the representations included in the tasks can influence student reasoning. We found that the use of a spatial orientation reasoning strategy supported students in generating a NP with correct stereochemistry (R–M factor). We also found that some students used the rule-based reasoning patterns not only when translating between DWDs and NPs, but also when generating a NP from a DWD.
Finally, when using representations to make inferences about stability, each participant recognized that staggered conformers are more stable than eclipsed conformers but provided different explanations of what causes the relatively higher steric strain in eclipsed conformers (C–R–M factor).

3.4.2 Summary of Findings for Each Representative Student

We explored how the three representative students reasoned about DWDs and NPs. Each participant showed proficiency in reasoning about some aspects of the representations and scientifically non-normative reasoning about other aspects.


Dee demonstrated appropriate reasoning more frequently than the other students. She could interpret the spatial relations of DWDs and NPs (R–M factor) and articulate the purpose of both representations (R–M and C–R–M factors). At the same time, she was able to count the number of atoms in the NP but could not identify all the implicit hydrogens in the DWD (R–M factor). However, her inability to count hydrogens did not negatively impact her performance in the translate or generate tasks (R–M factor). Unlike Carl and Crystal, during the translate and generate tasks, Dee did not rely on any rule-based reasoning patterns and was very consistent with the strategies that she used (R–M factor, e.g., mapping labelled atoms, mapping functional groups, and spatial orientation strategy). These strategies supported her in answering most of the questions correctly. Finally, despite being able to use NPs to make correct inferences about stability, she inappropriately attributed electronegativity as the cause of steric strain (C–R–M factor).

Carl was also able to interpret the spatial relations of DWDs and NPs (R–M factor) and describe the purpose of the two representations (C–R–M factors) but could not identify the implicit hydrogens in DWDs (R–M factor). Unlike Dee, Carl reasoned inconsistently during the translation tasks. He used a rule-based reasoning pattern and arrived at correct answers when translating from DWDs to NPs (R–M factor). Contrastingly, he used a different rule-based pattern and arrived at incorrect answers when translating from NPs to DWDs. This illustrates the well-established notion that students can use inappropriate reasoning to arrive at correct answers. Like Dee, Carl's inability to count the hydrogens in DWDs did not impact his ability to generate a NP with the correct number of atoms (R–M factor). However, unlike Dee, the NP that he generated did not have correct stereochemistry because he did not use the spatial orientation strategy.
Finally, Carl was able to use NPs to make correct inferences about stability and explained that steric strain is the cause of instability for the eclipsed conformer (C–R–M factor).

Crystal was also able to interpret the spatial relations of DWDs and NPs (R–M factor) and to explain the purpose of each representation (C–R–M factor). However, she could not count the hydrogen atoms in either representation (R–M factor). Across the translate tasks, unlike Dee (who reasoned consistently and appropriately) and Carl (who reasoned inconsistently and inappropriately), Crystal reasoned somewhat consistently but inappropriately. She used the same reasoning strategies when translating in both directions, but the appropriateness of her reasoning fluctuated even within the tasks focusing on translations in the same direction (R–M factor). Though Crystal did not use a rule-based reasoning pattern while translating from the DWDs to NPs, she used the same rule-based reasoning pattern to arrive at incorrect answers when translating from NPs to DWDs and when generating a NP from a given DWD (R–M factor). Crystal demonstrated how the same inappropriate rule-based reasoning pattern can be used across tasks focusing on different RC skills. Crystal also generated a NP that had an incorrect number of atoms, which aligns with her inability to count hydrogen atoms while interpreting DWDs and NPs (R–M factor). Despite those challenges,


Crystal was able to explain the purpose of the two representations. For example, she reasoned about electron repulsion and the dynamic nature of molecules when interpreting DWDs (C–R–M factor). Finally, like Carl, Crystal was able to use NPs to make correct inferences about stability and explained that steric strain is the cause of instability for the eclipsed conformer (C–R–M factor).

3.4.3 Conclusions

Herein, we have presented an analysis of how students reason using DWDs and NPs while completing various tasks that require different representational competence skills: interpret, generate, translate, and use representations. To our knowledge, this is the first study that captured student reasoning and multiple representational competence skills across the same representations. This work demonstrates that the appropriateness of student reasoning can vary (a) across tasks focusing on different RC skills, (b) across different representations, and (c) depending on whether the student attends to the mode of a representation (R–M factor) or the conceptual information embedded in the mode of a representation (C–R–M factor). We also identified instances of misalignment between the appropriateness of student reasoning and the accuracy of their answers to the questions. This can be attributed to the rule-based reasoning patterns that some of the students used to translate between and generate representations. As stated by Daniel and colleagues, “the way students make sense of a visualization may lead to correct responses, but this does not mean that the students have used an appropriate approach”.23 These findings provide several implications for instruction and research.

3.5 Implications

3.5.1 Implications for Instruction

Given the misalignment between student answers and their reasoning, we encourage instructors to consistently provide opportunities for students to explain their reasoning. This will enable the identification of non-normative reasoning such as the rule-based reasoning patterns identified in this study. This could be best facilitated through group discussions or open-response assessments prompting students to explain their reasoning. In circumstances where instructors are unable to elicit student reasoning through these more time-demanding avenues, the identified rule-based reasoning patterns can inform the writing of multiple-choice questions in which the use of a reasoning pattern would lead to a particular distractor. If students select that distractor, then instructors will know those students are likely using inappropriate reasoning and can address this in future instruction.

Here, we provide an example of how the rule-based reasoning patterns can inform the design of multiple-choice items (see Figure 3.3). The second rule-based reasoning pattern states that substituents on the front carbon in a Newman projection are on a wedge in a dash-wedge diagram, and substituents on the back carbon in a Newman projection are on a dash in a dash-wedge diagram. Despite relying on this scientifically non-normative reasoning pattern, Carl selected the correct response choice during the interview (see online supplemental Figure S3.3). This information can be used to design better distractors (i.e., creating a distractor that has the potential to elicit this reasoning pattern) and answers (i.e., ensuring that students cannot use the reasoning pattern to arrive at the correct answer).

Figure 3.3  Design of a multiple-choice assessment item informed by students' rule-based reasoning pattern that substituents on the front carbon in a Newman projection are on a wedge in a dash-wedge diagram, and substituents on the back carbon in a Newman projection are on a dash in a dash-wedge diagram.

Figure 3.3 presents an example item designed to capture this non-normative reasoning pattern. Response choice D is the correct answer. Response choices A–C are designed to elicit the rule-based reasoning pattern. With these modifications, students cannot use the rule-based reasoning pattern above to obtain the correct answer. However, the rule-based reasoning patterns can also inform the writing of items in which none of the answer choices aligns with the patterns, so that students do not have the opportunity to engage in scientifically unsound reasoning.

Finally, our study suggests that instructors should not assume student RC. One particularly concerning issue identified in this study was students' inability to recognize implicit atoms in DWDs and NPs. While this may not have hindered performance on all tasks, it posed a challenge for some students when generating NPs. Furthermore, as instruction progresses into more complex topics (e.g., reaction mechanisms, spectroscopy), prolonged reinforcement of the fundamental RC skills may support student learning.

3.5.2 Implications for Research

Though this work presents an in-depth analysis of how student reasoning varies across multiple RC skills, it is a case study with three representative participants from a pilot study of only ten students. More research with larger samples that explicitly analyzes how reasoning varies across multiple RC skills is needed. The set of skills can also be expanded. For example, this work relied on inferences about stability to analyze how students use NPs, but future studies could also investigate how students make inferences about physical or chemical properties when presented with different representations of molecular structure. Future studies should also investigate students' meta-representational competence skills (e.g., the ability to select the most appropriate representation for a particular purpose, or to identify the affordances and limitations of representations).

This work revealed that student representational competence varied between DWDs and NPs. Additional studies should investigate how students reason with other representations of molecular structure (e.g., chair conformations, condensed structures). There is also an opportunity to incorporate Schönborn and Anderson's model of factors to investigate how reasoning about conceptual information intertwines with reasoning about the mode of the representation in the context of other representations. Finally, the participants in this study were taught by three different instructors from the same institution. Additional research should investigate how instructional practices shape the ways students reason about representations.

Acknowledgements

This work was supported by the National Science Foundation, DUE 2025216.

References

1. R. Kozma and J. Russell, in Visualization in Science Education, ed. J. K. Gilbert, Springer, London, UK, 2005, pp. 121–145.
2. W. Winn, Contemp. Educ. Psychol., 1993, 18, 162.
3. R. Kozma and J. Russell, J. Res. Sci. Teach., 1997, 34, 949.
4. D. F. Lohman, Spatial Ability: A Review and Reanalysis of the Correlational Literature, Technical Report No. 8, Aptitude Research Project, Stanford University, Palo Alto, CA, USA, 1979.
5. J. T. Olimpo, B. C. Kumi, R. Wroblewski and B. L. Dixon, Chem. Educ. Res. Pract., 2015, 16, 143.
6. M. Stieff, Sci. Educ., 2010, 95, 310.
7. M. Stieff, M. Hegarty and B. Dixon, in Diagrammatic Representation and Inference, ed. A. Goel, M. Jamnik and N. H. Narayanan, Springer, Berlin, Germany, 2010, pp. 115–127.
8. A. T. Stull, M. Hegarty, B. Dixon and M. Stieff, Cognit. Instr., 2012, 30, 404.
9. K. J. Schönborn and T. R. Anderson, Int. J. Sci. Educ., 2008, 31, 193.
10. C. W. Bowen, J. Chem. Educ., 1994, 73, 184.
11. D. G. Herrington and P. L. Daubenmire, in Tools of Chemistry Education Research (ACS Symposium Series), ed. D. Bunce and R. Cole, American Chemical Society, Washington, DC, 2014, pp. 31–59.
12. K. J. Linenberger and S. L. Bretz, Chem. Educ. Res. Pract., 2012, 13, 172.
13. J. Saldana, The Coding Manual for Qualitative Researchers, Sage Publications, Thousand Oaks, CA, 2nd edn, 2013.
14. F. M. Watts and S. A. Finkenstaedt-Quinn, Chem. Educ. Res. Pract., 2021, 22, 565.
15. V. N. Anney, J. Emerg. Trends Educ. Res. Policy Stud., 2014, 5, 272.
16. A. K. Shenton, Educ. Inf., 2004, 22, 63.
17. A. B. Flynn and N. E. Bodé, J. Chem. Educ., 2016, 93, 593.
18. A. B. Flynn and R. B. Featherstone, Chem. Educ. Res. Pract., 2017, 18, 64.
19. M. Harle and M. A. Towns, J. Chem. Educ., 2011, 88, 351.
20. J.-M. G. Rodriguez, A. R. Stricker and N. M. Becker, Chem. Educ. Res. Pract., 2020, 21, 536.
21. R. K. Yin, in Handbook of Complementary Methods in Education Research, ed. J. L. Green, G. Camilli and P. B. Elmore, Routledge, New York, 2006, pp. 111–122.
22. M. S. Boukhechem, A. Dumon and M. Zouikri, Chem. Educ. Res. Pract., 2011, 12, 331.
23. K. L. Daniel, Towards a Definition of Representational Competence, Springer, Cham, Switzerland, 2018.


SECTION B

Chapter 4

Fostering Causal Mechanistic Reasoning as a Means of Modelling in Organic Chemistry

Olivia M. Crandell*a and Melanie M. Cooperb

a University of Minnesota Rochester, Center for Learning Innovation, Rochester, Minnesota, USA; b Michigan State University, Department of Chemistry, East Lansing, Michigan, USA
*E-mail: [email protected]

Advances in Chemistry Education Series No. 10
Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices
Edited by Nicole Graulich and Ginger Shultz
© The Royal Society of Chemistry 2023
Published by the Royal Society of Chemistry, www.rsc.org

4.1 Introduction

For a novice student, organic chemistry may appear to be a disorganized collection of reactions to be memorized, while to an expert, it is a robust landscape of cause-and-effect relationships.1 Underpinning an expert's understanding of a chemical reaction is not just the understanding that something happens but, more powerfully, how and why it happens, often characterized as mechanistic reasoning. Typically, the word reasoning refers to an explanation in written or oral form, but for organic chemists, reasoning about a chemical reaction is typically communicated implicitly using structures, symbols, and curved arrows. Bhattacharyya2 surveyed organic chemists nationwide to establish a consensus definition for mechanistic reasoning using the electron-pushing formalism in organic chemistry and found that experts favored the following definitions:

The representation of the movement of electrons and atoms to demonstrate the stepwise transformation of a set of reactants into the products of a chemical process. The resulting mechanisms are 'working hypotheses' based on established paradigms of chemical reactivity.

A working hypothesis that allows one to rationalize or predict the outcome of a given transformation by representing the shifting of single electrons or electron lone pairs. This is mostly based on our established body of knowledge of mechanistic organic chemistry, but can occasionally venture into more exotic hypotheses.

Though the definitions above do not explicitly use the term, organic chemists' use of electron-pushing arrows can be considered an example of scientific modelling. “Scientists use models…to represent their current understanding of a system (or parts of a system) under study, to aid in their development of questions and explanations, and to communicate ideas to others.”3 Both definitions of mechanistic reasoning assume that the electron-pushing formalism serves as a model of “established knowledge”. This assumption is perfectly reasonable for experts, but studies with undergraduate and graduate organic chemistry students suggest that instructors and researchers should not make this assumption about novice learners.4–8 That is, novice students may use the electron-pushing formalism in ways not underpinned by established chemical knowledge, for example, by drawing arrows backwards, or drawing random or meaningless arrows.4 Even when the arrows are canonically appropriate, it has been shown that some students propose mechanistic arrows only after drawing a memorized product.7 Given the evidence about students' difficulties with the electron-pushing formalism, we might conclude that students may be using the model (i.e., drawing arrows) without engaging in expert-like modelling (i.e., mechanistic reasoning). Such studies4–8 indicate that students' electron-pushing mechanisms alone cannot reliably or validly measure students' engagement in mechanistic reasoning.
To further investigate students' reasoning about how reactions occur, research in this area has necessarily turned to eliciting explicit evidence through interviews and written responses, much of which is reviewed in other chapters in this book. In our work, we have found that purposeful curriculum design and well-designed, scaffolded prompts are key to guiding students to use mechanistic arrows as models in a more expert-like way. The aim of this chapter is to unpack our definition of mechanistic reasoning, present our emergent characterization schemes, and discuss evidence-based strategies to elicit such reasoning. We will then present evidence to support the integration of causal mechanistic reasoning into chemistry learning environments and close with actionable strategies to help instructors foster learning environments centered on causal mechanistic reasoning.


4.2 Causal Mechanistic Reasoning Underpins Expert-like Modelling

In our work studying student reasoning, we explicitly use the term causal mechanistic reasoning to underscore the importance of both the causal and mechanistic components. We define causal mechanistic reasoning as an explanation that identifies causal factors and underlying entities a scalar level below the phenomenon of interest and provides a stepwise account of the activities of the entities linking the cause to the effect. This definition is informed by work from Russ, Scherr and Hammer9 and Krist, Schwarz and Reiser.10

Russ and colleagues provide a thorough review of what does and does not count as mechanistic reasoning.9 They draw on literature from the philosophy of science11 to propose a framework for mechanistic reasoning that identifies underlying entities, their properties, and their activities. Building on their work, Krist and colleagues10 have developed a simplified framework designed to work across disciplines to identify the epistemic heuristics that provide students with a way to construct explanations of phenomena. In this framework, the authors specify that the entities underlying the phenomenon must be identified at one scalar level below the phenomenon of interest.10

Krist and colleagues' work in middle school science offers evidence of the importance of underlying entities. For example, when asked to explain how melted chocolate hardens, students had to identify entities at the molecular level, a scalar level below the observed macroscopic level. At this molecular level, students must consider how interactions and activities of molecules cause a liquid to become a solid, and then link this molecular-level explanation to the hardened chocolate at the observable level. The intermolecular interactions that cause the phase change can only be discussed by invoking entities at the molecular level.
But how does this concept of reasoning at the “scalar level below” apply when the phenomenon of interest is already being discussed at the molecular level? In the chocolate example, the explanation lies in the intermolecular interactions at the molecular level, but what if the phenomenon of interest is the causal mechanism by which intermolecular forces such as London dispersion forces arise? In this case, the explanation must dive down to the sub-atomic level and identify the properties and activities of electrons and protons (i.e., the “things” underlying atoms).12,13

For chemical reaction mechanisms, the phenomenon is the rearrangement of atoms to form new substances, and underlying these atoms are the electrons and their movements. Thus, a mechanistic explanation of a chemical reaction must describe the activities of electron movement. The activities of electrons are cited as necessary components in both expert definitions of mechanistic reasoning in organic chemistry shown above.2 Ideally, students will discuss electron movement in combination with their knowledge of electrostatic interactions. We place significant value on students' ability to reason about structure–property relationships and their manifestation in electrostatic interactions. This is an intentional choice


based on literature suggesting that predicting properties from Lewis structures is far from simple for general and organic chemistry students.14 Additionally, structure–property relationships and electrostatic interactions undergird numerous chemical phenomena and, as such, are considered core ideas in chemistry15,16 and, more broadly, in the physical sciences at the K–12 level in the United States.3 While we certainly acknowledge that there are other “causes” underpinning chemical reactions, such as entropy, energy, reaction rates, and equilibrium, our initial work has focused on how to support students' use of electrostatic interactions as causal mechanisms. To do this, we have chosen to use simple systems such as Lewis acid–base reactions17,18 and an SN2 reaction to elicit deep reasoning about the role of electrostatic interactions.19

Reasoning about electrostatic interactions at the atomic and sub-atomic level inherently requires one to engage in modelling, as “Models make it possible to go beyond observables and imagine a world not yet seen. Models enable predictions of the form ‘if…then…therefore’ to be made to test hypothetical explanations.”3 Experts' use of mechanistic arrows does just that: experts use their chemical knowledge to predict how posited chemical structures determine chemical properties and how those properties cause molecules to interact with one another. An expert's model will most likely include numerous causes such as relative nucleophile strength, relative electrophile strength, nature of the solvent, temperature, reaction time, and equilibrium, just to name a few. These causes are strung together in a sophisticated network of “if…then” cause-and-effect relationships, what we call a causal mechanistic explanation. Some experts use this model of electron movement to predict what types of products might form or to rationalize how an isolated product came to be.
In either case, the electron-pushing formalism serves to model experts’ causal mechanistic thinking.

4.3 Characterizing Causal Mechanistic Reasoning Across Different Reactions

In our first causal mechanistic reasoning study,17 we asked students to reason about a simple acid–base reaction between HCl and H2O. Though they may not have named it directly, many students used Lewis acid–base theory by discussing the reaction in terms of electron donors and acceptors. Based on our definition of causal mechanistic reasoning, responses that identified electrons and described their movement were considered mechanistic. Responses including both causal and mechanistic components were characterized as Lewis Causal Mechanistic (LCM). An exemplar LCM response is “The lone pair on the water molecule attracts the hydrogen from the HCl. The H–Cl bond is broken and forms a new bond with oxygen. The reaction occurs because the partial negative charge on the oxygen attracts the partial positive charge on the hydrogen. The bond between the hydrogen and Cl is less strong than the bond that forms between hydrogen and oxygen.”


Though the goal was to elicit both causal and mechanistic reasoning components together, many responses included only one or none of these pieces. We developed a coding scheme to characterize the productive ideas students included in their reasoning. Responses that only discussed electron movement were characterized as Lewis Mechanistic (LM). Some responses used the Brønsted acid–base theory to explain or describe the reaction from the perspective of proton donors and acceptors. Responses that explicitly discussed a proton transfer were characterized as Brønsted Descriptive and those that explicitly discussed the proton’s attraction to the acceptor were characterized as Brønsted Causal since they invoked the causal component. Responses that only described the bonds that were broken and formed were characterized as General Descriptive. This coding scheme is summarized in Figure 4.1A.

Figure 4.1 Causal mechanistic reasoning characterization schema for an acid-base reaction (A) and a nucleophilic substitution reaction (B).17–19 Reproduced from ref. 17–19 with permission from American Chemical Society, Copyright 2017, 2019, 2020.


The causal mechanistic characterization scheme used for the reaction of HCl and H2O was modified slightly to capture new ideas that emerged for the reaction of CH3Br and OH−.19 Since this reaction did not involve a proton transfer, we consolidated the two descriptive bins and eliminated the nomenclature attributed to the Brønsted and Lewis acid–base models, as shown in Figure 4.1B. It is important to note that the Descriptive Mechanistic (DM) and Descriptive Causal (DC) characterizations are not necessarily hierarchical because both types of responses can be improved in some way.

4.4 Eliciting Causal Mechanistic Reasoning—Attention to Scaffolding

Our work in eliciting causal mechanistic responses suggests that appropriate prompting and scaffolding are fundamental to assessing what students know and can do when constructing a causal mechanistic response.17 Wood and colleagues20 introduced the term scaffolding as a metaphor for tutoring to support students in problem-solving. We use the term scaffold to describe the features of the prompt given to students asking them to write or draw. In this early work, Wood and colleagues identified different ways that scaffolding can function to support learners, such as reducing the degrees of freedom by simplifying the task into more manageable sub-tasks so the learner can better gauge their success.20 This has been particularly relevant to our causal mechanistic reasoning task structure.

In our early work eliciting causal mechanistic reasoning, we asked students to classify the reaction and explain their reasoning for the reaction of HCl with H2O.17 Next, students were asked to explain what they thought was happening at the molecular level (Figure 4.2A). Students

Figure 4.2 Evolution of prompts to elicit evidence of student engagement in causal mechanistic reasoning. The initial iteration of the acid-base prompt (A), the revised acid-base prompt (B), and the nucleophilic substitution prompt (C). The acid-base prompts are reproduced from ref. 17 with permission from American Chemical Society, Copyright 2017. The SN2 prompt is reproduced from ref. 19 with permission from American Chemical Society, Copyright 2020.

Fostering Causal Mechanistic Reasoning as a Means of Modelling in Organic Chemistry


were provided with Lewis structures for both reactants and products. With this initial iteration of the prompt, we found that very few students invoked causal and mechanistic components, keeping their descriptions at a grain size above the sub-atomic mechanistic level.17 Only 8% of general chemistry students generated an LCM response (Figure 4.3A). Some students (40%) gave a descriptive response, which did not provide the evidence of student understanding we were seeking, especially since students had been given the reaction products. The problem, we believed, lay in the prompt structure: students did not understand what we meant by “explain”. The initial prompt did not give students enough information about what was expected in a response. It may well have been that students were fully capable of giving a causal mechanistic response but did not find the causal and mechanistic pieces pertinent to the question being asked. That does not mean that students’ descriptions of bond breaking and forming were inherently incorrect; students frequently identified proton transfers to and from the correct species. Rather, the responses lacked the desired depth: a causal mechanistic response. There is a key epistemological difference between responses that are incorrect and responses that are surface level or incomplete. We therefore revised the prompt, separating “what” and “why” into distinct sub-tasks to communicate that “what” and “why” are indeed different components (Figure 4.2B). This was intended as an invitation to describe the reaction; our goal was to help students understand the difference between a description and an explanation. Then we asked why the reactants form the products. Since students had already provided a description, we hoped that they would

Figure 4.3 Characterization of causal mechanistic reasoning across two acid-base prompt structures. Responses elicited with the initial prompt structure are shown in A and responses elicited with the revised prompt structure are shown in B. (GD = General Descriptive, BD = Brønsted Descriptive, BC = Brønsted Causal, LM = Lewis Mechanistic, LCM = Lewis Causal Mechanistic). Reproduced from ref. 17 with permission from American Chemical Society, Copyright 2017.


better understand what we were asking in the second, “why”, question. Finally, students were asked to draw mechanistic arrows to accompany their explanation (Figure 4.2B). This revised prompt was administered to a second group of general chemistry students comparable to the initial group: both groups were enrolled in a transformed general chemistry course called Chemistry, Life, the Universe and Everything (CLUE),21 and no meaningful differences were found in their course grades. The revised prompt elicited a four-fold increase in the number of Lewis Causal responses (Figure 4.3). We attribute this change to the increased scaffolding of the revised prompt, which encouraged students to think about the phenomenon in ways they normally would not. In the context of reaction mechanisms, we see a connection between the productive ideas in students’ responses and their correct use of the electron-pushing formalism as a model. Since the revised prompt asked students to draw mechanistic arrows in addition to their written explanation, it also created a unique opportunity to target students’ engagement in modelling. We found that, in general, as the sophistication of the written response increased, the ratio of correct to incorrect arrows also increased. In a follow-up study with organic chemistry students and the reaction of CH3Br and OH−, we modified the prompt further to guide students’ thinking by activating relevant knowledge for causal mechanistic reasoning. Students were asked to classify, describe, explain, and draw mechanistic arrows (Figure 4.2C), and then to explain their arrow drawings. By asking students to specifically explain their drawing, we hoped to elicit evidence of students engaging in expert-like modelling by connecting representations to underlying knowledge.
In fact, asking students to draw mechanistic arrows and then discuss their drawing (Figure 4.2C) encouraged more students to include explicit discussion of lone pairs and their activities. Simply by adding this extra reasoning task, the proportion of responses that included an explicit mechanistic component increased from 44% after parts i–iii to 66% after part v. That is, prompting students to discuss why they were drawing mechanistic arrows elicited more complete causal mechanistic explanations, encouraging more students to engage in expert-like modelling. Making this explicit connection between drawing and explanation is particularly important scaffolding because evidence suggests that students view models as a way to show or describe a phenomenon rather than as a way to explain or predict.22

4.5 Causal Mechanistic Reasoning in Organic Chemistry

We have also conducted two longitudinal causal mechanistic reasoning studies in the context of organic chemistry, each over the course of two semesters.18,19 The first asked students to reason about the reaction of HCl with H2O18 and the second used a simple SN2 reaction.19


For the first study, we investigated the possible impact of students’ general chemistry course experience on their reasoning in a traditional organic chemistry course. We define a traditional course as one that follows a commercially available textbook with assessments that do not require students to show evidence of reasoning. We administered the task to students while they were enrolled in a traditional organic chemistry course and then identified student cohorts based on their prior general chemistry course experience. One group of students had completed two semesters of the transformed CLUE general chemistry course.21 The rest of the students had either taken a more selective general chemistry course or had not taken a general chemistry 2 course at all but had a mixture of general chemistry 1 experiences. Surprisingly, we found no significant differences between the causal mechanistic reasoning trends of students who had a selective general chemistry experience and students who had no general chemistry 2 course; we therefore consolidated them into one group. It is important to note that all the students in this study were enrolled in the same traditional organic chemistry course and were followed longitudinally. Students responded to the acid–base prompt shown in Figure 4.2B at both the start and end of their two-semester course. CLUE students had also responded to the prompt at the end of their general chemistry 2 course, allowing us to compare CLUE students’ reasoning across three time points. At the end of general chemistry 2, 40% of CLUE students constructed a causal mechanistic response (characterized as Lewis Causal Mechanistic). Little had changed at the start of organic chemistry 1, and by the end of organic chemistry 2 this percentage had increased to 57%, as shown in Figure 4.4A. In contrast, only 13% of students in the other cohort started organic chemistry engaging in causal mechanistic explanation.
Most students constructed Brønsted Descriptive responses (32%) or Brønsted Causal responses (23%). By the end of organic chemistry 2, there was a shift from descriptive-only and causal-only responses to mechanistic responses, with 30% of students constructing a Lewis Mechanistic response and 40% constructing Lewis Causal Mechanistic responses, as shown in Figure 4.4B. While, as expected, all students improved markedly in their ability to construct causal mechanistic explanations over two semesters of organic chemistry, students who took CLUE general chemistry began and ended with a greater proportion of causal mechanistic reasoning. This evidence suggests that student engagement in causal mechanistic reasoning can be sensitive to instruction specifically aimed at fostering such reasoning, and that such instruction can promote long-lasting improvements in student reasoning. We will return to a more detailed discussion of the key features of the transformed CLUE course in a later section. In a second longitudinal study,19 we investigated reasoning about the SN2 reaction of CH3Br and OH− shown in Figure 4.2C. Students in this study were enrolled in two different organic chemistry courses: a transformed organic chemistry course, Organic Chemistry, Life, the Universe and Everything (OCLUE),23 or a traditional course that followed a traditional text with


Figure 4.4 Comparison of causal mechanistic reasoning characterization for the reaction of HCl and H2O. Students enrolled in the CLUE general chemistry curriculum are shown in A and students enrolled in a Selective general chemistry experience or no general chemistry 2 are shown in B. (GD = General Descriptive, BD = Brønsted Descriptive, BC = Brønsted Causal, LM = Lewis Mechanistic, LCM = Lewis Causal Mechanistic). Reproduced from ref. 18 with permission from American Chemical Society, Copyright 2019.

traditional assessments. Both cohorts were sampled immediately after they learned about nucleophilic substitution in their respective courses (Mid OC1) and then again at the end of organic chemistry 2 (End OC2). Comparing reasoning across these instructional contexts allowed us to understand possible effects of organic chemistry curricula on student engagement in modelling in their organic course.


Figure 4.5 shows student reasoning for these two cohorts across two time points. Both cohorts showed similar reasoning patterns immediately after they learned about SN2 reactions in their courses, with around 55% of responses characterized as causal mechanistic. However, by the end of organic chemistry 2, the percentage of OCLUE students engaged in causal mechanistic reasoning had increased to over 60%, while the Traditional students’ percentage had decreased to 40%. A repetition of this study the following year showed similar results.19 We attribute these differences to differences in course culture and structure. The OCLUE curriculum treats explanations not merely as something delivered to students to help them learn, but as an essential scientific practice that students themselves should engage in to deepen their understanding. OCLUE students are encouraged to use their knowledge of electrostatic interactions and structure–property relationships to construct explanations about various phenomena in low-stakes homework assignments, discussion activities, and summative assessments. We believe this consistent emphasis on explanations develops

Figure 4.5 Comparison of causal mechanistic reasoning characterization for the reaction of CH3Br and OH−. Students enrolled in OCLUE are shown in A and students enrolled in a traditional organic chemistry course are shown in B. (DG = Descriptive General, DC = Descriptive Causal, DM = Descriptive Mechanistic, CM = Causal Mechanistic). Reproduced from ref. 19 with permission from American Chemical Society, Copyright 2020.


a unique course culture where students understand that reasoning is valued.24 We will further discuss instructional strategies to foster such reasoning in a later section.

4.6 Characterizing the Relationship Between Reasoning and Arrow Drawings

To understand how students’ reasoning relates to their mechanistic arrows, we coded the accuracy of students’ mechanistic arrows and mapped those results onto their reasoning, as shown by the stacked bar graphs in Figures 4.4 and 4.5. In the first longitudinal study with the acid–base reaction, we found that, for CLUE students, there was a trend between causal mechanistic reasoning and the percentage of correct arrows at all three time points. For example, at the end of general chemistry 2, when 40% of CLUE students engaged in causal mechanistic reasoning, most of those responses were associated with correct mechanistic arrows. By the end of organic chemistry 2, almost all of CLUE students’ causal mechanistic responses were also associated with correct arrows. Overall, we found that CLUE students were more successful at drawing arrows, which makes sense because these students were taught to use mechanistic arrows in their general chemistry course, giving them an advantage over the comparison group, particularly at the start of organic chemistry. For the selective general chemistry group at the start of organic chemistry, responses were largely descriptive with largely incorrect mechanistic arrows. Again, this is not surprising since these students had just been introduced to mechanistic arrows for the first time. By the end of organic chemistry these students had improved considerably but never “caught up” to the CLUE students. It is notable that, by the end of organic chemistry 2, we would expect all students to be able to draw correct mechanistic arrows for a simple acid–base reaction. The reaction of an acid with water is perhaps the most fundamental chemical mechanism in organic chemistry.
Examples include the proton transfer that is often implicit in the formation of hydronium in aqueous acid and carbonyl activation via simple proton transfer under acidic conditions; in fact, proton transfers account for most mechanistic steps catalogued in the MACiE database.25 In this first study, we characterized students’ mechanistic arrows simply as correct or incorrect, but in our second study we took the analysis of student drawings one step further. For the reaction of CH3Br and OH−, we also analyzed whether students drew arrows corresponding to the process they described or explained in their reasoning. In most cases, students were consistent between what they drew and explained, but in a few instances students drew one process and explained another; we denote these as non-matching arrows and explanations. These data are presented in the stacked bar graphs in Figure 4.5.


We found that traditional students’ engagement in causal mechanistic reasoning did not necessarily correspond to correct reasoning about an SN2 reaction. Right after learning about these two processes in their course, Traditional students who engaged in causal mechanistic reasoning were just as likely to draw and explain an SN1 process as an SN2 process. While drawing and explaining an SN1 process represents consistency in their modelling, it is not a feasible mechanism for the given reaction. In contrast, OCLUE students were largely accurate and consistent in their reasoning and arrow use, both when they first learned about these reactions and at the end of two semesters. Evidence from these studies with organic chemistry students suggests that instructors may be overestimating students’ ability to use reaction mechanisms to model their understanding of even the simplest chemical processes, a conjecture which supports other work in organic mechanistic reasoning.4–8

4.7 Summary

These two longitudinal studies suggest that engagement in causal mechanistic reasoning and modelling is highly sensitive to curricular influence. In the first study, we found that even for a simple acid–base reaction, students who had taken a transformed general chemistry course (CLUE), in which they were expected to construct explanations and models on a weekly basis, were, not surprisingly, more likely to engage in causal mechanistic reasoning than peers who had not had such experiences in their general chemistry courses. We also saw more competent use of mechanistic arrows by students enrolled in these transformed courses. Additional studies comparing OCLUE students to traditional students have found that OCLUE students are more likely to construct reasonable arrow-drawing mechanisms for a reaction they have never seen before.26 In that study, OCLUE students were more willing and better prepared to use their knowledge to reason about an unknown reaction mechanism: while 42% of OCLUE students generated a plausible product for a reaction they had not previously seen, only 8% of traditional students were able to do the same. Up to this point, we have described the design of these transformed courses. We will conclude this chapter by discussing key curricular and pedagogical strategies that appear to foster causal mechanistic reasoning.

4.8 Strategies for Fostering Causal Mechanistic Reasoning in Learning Environments

If we want students to engage in mechanistic reasoning in expert-like ways, we must help students overcome tendencies to memorize mechanistic arrows and reproduce memorized schemes when modelling reaction mechanisms. Students value the knowledge and practices they find on their assessments; therefore, if we want students to engage in certain practices, our assessments must incentivize that engagement.24,27 However, we cannot simply expect students to be able to do this on their own. The CLUE21 and OCLUE23 curricula engage students in modelling, explanation, and argument from the very first day of class by teaching students how to construct scientific explanations and offering low-stakes opportunities to practice these scientific practices in lecture, in small-group discussions, and on homework assignments. Students are provided with feedback to guide their development toward more expert-like reasoning strategies that invoke relevant knowledge and carefully link evidence and reasoning, as suggested by McNeill and Krajcik.28 All of these activities are practiced in the context of low-stakes formative assessment tasks that are graded not for correctness but for good-faith effort. When mechanistic arrows are introduced, the underlying explanations are emphasized as necessary complements to the representation on summative exams throughout the course. Instructors hoping to engage their students in causal mechanistic reasoning can consider the many ways that explanation can be explicitly incorporated into their syllabus, on formative and summative assessments as well as in learning objectives that define what students are expected to know and, most important, what students are expected to do with their knowledge.29 While it may seem obvious, the evidence presented in this chapter suggests that if we want students to engage in mechanistic reasoning in an expert-like way, we must teach them how to do that.

Acknowledgements

We would like to thank the organic chemistry instructors and students who participated in our study. This work was supported by the National Science Foundation under DUE 1502552.

References
1. N. P. Grove and S. L. Bretz, Chem. Educ. Res. Pract., 2012, 13, 201.
2. G. Bhattacharyya, J. Chem. Educ., 2013, 90(10), 1282.
3. National Research Council, A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas, National Academies Press, Washington, D.C., 2012.
4. G. Bhattacharyya and G. M. Bodner, J. Chem. Educ., 2005, 82(9), 1402.
5. G. Bhattacharyya, Chem. Educ. Res. Pract., 2014, 15, 594.
6. T. L. Anderson and G. M. Bodner, Chem. Educ. Res. Pract., 2008, 9, 93.
7. N. P. Grove, M. M. Cooper and K. M. Rush, J. Chem. Educ., 2012, 89(7), 844.
8. A. B. Flynn and R. B. Featherstone, Chem. Educ. Res. Pract., 2017, 18(1), 64.
9. R. S. Russ, R. E. Scherr, D. Hammer and J. Mikeska, Sci. Educ., 2008, 92(3), 499.
10. C. Krist, C. Schwarz and B. Reiser, J. Learn. Sci., 2019, 28(2), 160.
11. P. Machamer, L. Darden and C. F. Craver, Philos. Sci., 2000, 67(1), 1.
12. N. Becker, K. Noyes and M. M. Cooper, J. Chem. Educ., 2016, 93(10), 1713.
13. K. Noyes and M. M. Cooper, J. Chem. Educ., 2019, 96(9), 1821.
14. M. M. Cooper, L. M. Corley and S. M. Underwood, J. Res. Sci. Teach., 2013, 50(6), 699.
15. J. T. Laverty, S. M. Underwood, R. L. Matz, L. A. Posey, J. H. Carmel, M. D. Caballero, C. L. Fata-Hartley, D. Ebert-May, S. E. Jardeleza and M. M. Cooper, PLoS One, 2016, 11, e0162333.
16. M. M. Cooper, L. A. Posey and S. M. Underwood, J. Chem. Educ., 2017, 94(5), 541.
17. M. M. Cooper, H. Kouyoumdjian and S. M. Underwood, J. Chem. Educ., 2016, 93(10), 1703.
18. O. M. Crandell, H. Kouyoumdjian, S. M. Underwood and M. M. Cooper, J. Chem. Educ., 2019, 96(2), 213.
19. O. M. Crandell, M. A. Lockhart and M. M. Cooper, J. Chem. Educ., 2020, 97(2), 313.
20. D. Wood, J. S. Bruner and G. Ross, J. Child Psychol. Psychiatry, 1976, 17(2), 89.
21. M. M. Cooper and M. Klymkowsky, J. Chem. Educ., 2013, 90(9), 1116.
22. K. Lazenby, C. A. Rupp, A. Brandriet, K. Mauger-Sonnek and N. M. Becker, J. Chem. Educ., 2019, 96(3), 455.
23. M. M. Cooper, R. L. Stowe, O. M. Crandell and M. W. Klymkowsky, J. Chem. Educ., 2019, 96(9), 1858.
24. R. Bowen, A. Flaherty and M. M. Cooper, Chem. Educ. Res. Pract., 2022, 23(3), 560.
25. G. L. Holliday, D. E. Almonacid, J. B. O. Mitchell and J. M. Thornton, J. Mol. Biol., 2007, 372(5), 1261.
26. S. K. Houchlei, R. R. Bloch and M. M. Cooper, J. Chem. Educ., 2021, 98(9), 2751.
27. R. L. Stowe and M. M. Cooper, J. Chem. Educ., 2017, 94(12), 1852.
28. K. L. McNeill and J. S. Krajcik, Supporting Grade 5-8 Students in Constructing Explanations in Science: The Claim, Evidence, and Reasoning Framework for Talk and Writing, Pearson, United Kingdom, 2012.
29. M. M. Cooper, J. Chem. Educ., 2015, 92(8), 1273.

Chapter 5

Students’ Reasoning in Chemistry Arguments and Designing Resources Using Constructive Alignment

Jacky M. Deng,†a Myriam S. Carle†a and Alison B. Flynn*a

a University of Ottawa, 10 Marie Curie Pvt, Ottawa, K1N 9A4, Canada
*E-mail: [email protected]

5.1 Introduction

5.1.1 Citizens Need to be Able to Reason with Scientific Evidence

In a world facing complex global issues, citizens need to be able to make and justify claims informed by scientific evidence and reasoning.1,2 National frameworks for science education around the world have identified constructing evidence-based arguments as a key scientific practice.2–5 Arguments justify a fact or phenomenon that is not agreed upon;6,7 a claim is in doubt and must be advanced through reasoning by constructing an argument about the fit between the evidence and the claim.8–10 For example,

† Co-lead authors.

  Advances in Chemistry Education Series No. 10 Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices Edited by Nicole Graulich and Ginger Shultz © The Royal Society of Chemistry 2023 Published by the Royal Society of Chemistry, www.rsc.org


making a decision (claim) about whether to vaccinate requires choosing sources of evidence, interpreting the quality of the available evidence, and using this evidence to reason for or against a particular decision.11 In science and chemistry education, students’ written and oral arguments about phenomena have been used to gain insight into their reasoning.8,10,12–22 Reasoning has been the focus of many chemistry education research investigations analyzing students’ arguments and other contexts. There are several reasoning frameworks,23–38 each of which has been used to study different ways of thinking about students’ approaches to problems and phenomena in chemistry and other fields. Our research in student reasoning arose from anecdotal observations of exam responses in which some students stated evidence alone as justification for a given phenomenon, without going further to explain the relevance or impacts of that evidence (the “who cares”). While justifying phenomena (scientific argumentation) had long been an expectation in our department, a new curriculum focused on patterns and principles of reactivity made scaffolding the development of students’ scientific argumentation skills even more of a priority.39,40

5.2 Framework—Reasoning, Granularity, and Comparisons

We developed an analytical framework that characterizes arguments in terms of three dimensions: reasoning, granularity, and comparisons (Figure 5.1), which we describe below. In the chemistry education literature, students’ arguments have primarily been analyzed with a focus on students’ reasoning. However, characterizing students’ reasoning does not necessarily capture other dimensions of an argument that may be of interest to chemistry

Figure 5.1 Overview of the framework: levels of comparison, granularity, and mode of reasoning. Reproduced from ref. 21 with permission from the Royal Society of Chemistry.


educators and researchers, such as discussions at various scalar levels and/or comparisons against alternative claims.

5.2.1 Modes of Reasoning

We analyzed students’ arguments using four modes of reasoning: descriptive, relational, linear causal, and multi-component causal (Table 5.1).30,41 Descriptive arguments list features and/or properties of entities (e.g., the reactants, products) without establishing connections. Relational arguments include connections between properties of the entities and their activities, but these relationships are discussed in a correlative fashion (i.e., they lack causality). Causal arguments—linear and multi-component—include all features of a relational argument and additionally contain cause-and-effect relationships between the relevant properties of the entities and their activities. These modes have been used across various chemistry contexts to characterize students’ reasoning in written and verbal argumentation.12,20,21,35,42,43 The type(s) of reasoning needed can vary depending on the task and context. In some contexts, causal reasoning may be helpful, while in others, descriptive or relational reasoning may be more practical and appropriate for accomplishing the task. For example, how students choose to determine the direction of an acid–base equilibrium may differ on a final exam compared to a lab context. On a final exam, students may construct an argument with causal reasoning to demonstrate their understanding (comparing pKa values and discussing why the pKa values differ based on chemical factors). In the lab, students may instead demonstrate only descriptive or relational reasoning (comparing pKa values) to accomplish a similar task.

5.2.2 Levels of Granularity—Moving Between Grain Sizes

In addition to different modes of reasoning, arguments can also be constructed at different levels of granularity. For instance, justifications for why acetylsalicylic acid acts as an acid in water may focus on pH and pKa data (i.e., a phenomenological level) or on how the deprotonated form of acetylsalicylic acid is stabilized by delocalization (i.e., structural, electronic, and energetic factors).44 Both justifications discuss cause and effect but are constructed at different “scales” or levels of granularity.45 The idea of granularity has been described in other work related to scientific reasoning, including scales,46 levels,47 nested hierarchies,48 emergence (with ideas of downward and upward causality),49 and bottom-out reasoning.50 Chemists need to be able to move between different levels of granularity;49,51,52 for example, scientists may discuss how interactions between components at a submicroscopic level manifest as macroscopic properties.49,51,52 Different contexts and tasks may require different levels of granularity, as phenomena can be explained from larger perspectives (e.g., global levels and beyond) or smaller perspectives (e.g., atomic levels and beyond).50


Table 5.1 Modes of reasoning. Reproduced from ref. 21 with permission from the Royal Society of Chemistry.

Descriptive
Description: Claims are supported solely by describing features and/or the properties of entities (e.g., the reactants, products, etc.). No connections are established.
Example: NaH is a strong base so it will drive the equilibrium to products.

Relational
Description: Relationships between the properties of the reaction materials and their activities are discussed in a correlative fashion (i.e., a cause–effect relationship is not established).
Example: NaH will drive the equilibrium towards products because its conjugate acid has a higher pKa value (35) than the alkyne (25).

Linear Causal
Description: All the features of a relational argument are present and are accompanied by cause-and-effect relationships between the relevant properties of the entities and their activities.
Example: NaH will drive the equilibrium towards products because its conjugate acid (pKa value 35) is weaker/more stable than the starting alkyne (pKa value 25), according to the relevant pKa values. NaOH and NH3 are not strong enough bases to drive the equilibrium to the product side, as both of their conjugate acids are stronger than the starting alkyne (pKa values of 15.7 and 9, respectively).

Multi-Component Causal
Description: Phenomena are seen as the result of the static or dynamic interplay of more than one factor and the direct interactions of several components.
Example: The oxygen atom in NaOH is more electronegative than the carbon atom in the conjugate base, making it better able to stabilize the negative charge. Although atomic size (larger carbon atom) and hybridization suggest that the carbon atom in the conjugate base could be more stable, the pKa values of the acid and conjugate acids support the conclusion that NaOH is the weaker base, since the stronger the acid, the weaker its conjugate base (pKa value of H2O = 15.7; pKa of alkyne = 25). NH3 is similarly too weak to drive the equilibrium to the product side, with the nitrogen atom being neutral in addition to being more electronegative than the carbon anion.


In our research, we characterized the granularity exhibited in students’ organic chemistry arguments using four levels developed from the literature: phenomenological, structural, electronic, and energetic.21,44,47–49,53

5.2.3 Comparison—Considering Alternatives

A comparison is needed when an argument involves two or more possible claims, since reasoning and granularity do not capture how one considers alternatives within an argument.9 Comparing alternatives is key to argumentation: when deciding on a claim, one must consider multiple possibilities and justify the chosen claim by weighing various factors, evidence, etc. For example, justifying why phenol is a stronger acid than ethanol may include a description of why phenolate can delocalize and why ethoxide cannot. Arguments may include full comparisons (comparing choices by considering all factors), partial comparisons (comparing choices by considering some, but not all, factors), or no comparisons (not comparing at all).

5.3 Students’ Arguments Can Vary Between Tasks

We characterized the reasoning, granularity, and comparisons demonstrated in students’ written arguments on three organic chemistry final exam items: (1) comparing two reaction mechanisms and justifying which was more plausible (N = 159);20 (2) comparing three bases and justifying which would drive an equilibrium to products (N = 170);21 and (3) determining resonance structures’ relative contributions to the respective resonance hybrid (N = 284)43 (Figure 5.2). Across the three questions, we found that students’ responses varied in terms of their reasoning, granularity, and comparisons (Figure 5.3). Educators’ expectations about students’ arguments (based on intended and enacted course learning outcomes (LOs)) differed between the questions we analyzed. However, within the subset of responses for each question, students’ arguments aligned with the expected reasoning level.20,21,43 In the question comparing mechanisms (Q1), students had been taught to compare the structures of the starting materials and explain how and why the structural features informed reaction plausibility, but the answer could be obtained using a simple rule of thumb (e.g., relative carbocation stability guides reactivity, with methyl ≪ primary < secondary < tertiary). The educator explicitly explained and modeled that the rule was not enough to answer the question and that students had to explain why the rule was relevant. In other words, students’ reasoning was aligned with how they were taught and with the course expectations. The acid–base question (Q2) could be answered at two different levels of granularity, either of which was expected and accepted: energetic and electronic factors (e.g., formal charge, electronegativity) or phenomenological factors (e.g., pKa values); answers were distributed across both levels for this question. Similarly,

Students’ Reasoning in Chemistry Arguments and Designing Resources

79

Figure 5.2  The three questions analyzed and exemplar quotes from students that reflect the expected arguments for each task.

Figure 5.3  Summary of the reasoning, granularity, and comparisons demonstrated in students’ arguments for the three questions.

for the delocalization question (Q3), the students used the same reasoning they were taught, aligned with expectations. The arguments we analyzed demonstrated that students could produce sophisticated justifications of chemical phenomena in some tasks while producing predominantly descriptive arguments in other tasks. The structure of students’ arguments may therefore be associated with various factors, such as the task’s expectations, the task’s structure, LOs, student knowledge, and

Chapter 5

80

Figure 5.4  Students’ responses along with (A) misaligned factors, and (B) aligned factors.

student motivation. Each of these factors should be carefully communicated throughout a course to make expectations clear and best support students’ learning. Misalignment of these factors with an expected answer (Figure 5.4A) could lead to student confusion about expectations and result in answers that do not meet the expectations, despite students’ abilities to meet them. With effective alignment of the factors, students have a higher chance of knowing the expectations and being able to provide the expected answer (Figure 5.4B). In the following sections, we describe how readers can use constructive alignment to help students develop and demonstrate their reasoning, granularity, and comparisons within arguments.

5.4  Supporting Student Learning Through Constructive Alignment

5.4.1  Instructional Design

Ideally, the learning environment is inclusively designed in a way that aligns the intended LOs with instruction and assessment—this is known as constructive alignment54,55 and represents a shift in focus from what


the educator does to supporting what learners need. Viewed through the lens of constructive alignment, supporting students’ explanations of chemical phenomena requires not only changing instructional practices, but also aligning the LOs, assessment, pedagogy, and other instructional decisions.56,57 One way to facilitate constructive alignment is through backwards design,58 in which educators design activities, assessments, and a learning environment by: (1) identifying the intended LOs, (2) developing practice opportunities and identifying acceptable evidence of achievement (i.e., designing formative and summative assessments, including scaffolds), and (3) planning the learning experience (i.e., pedagogy and other instructional decisions). A general LO could take the form: justify the reactivity for [reactivity learned in Module x] by comparing the relevant entities and activities at [level of granularity] using [descriptive, relational, linear causal, or multi-component causal] reasoning (Table 5.2). An LO like this can be used to clearly communicate expectations and can be provided as a metacognitive tool for students.59 Such LOs can also be communicated in multiple ways: in a course syllabus, at the beginning or end of each lecture or class, etc. Once the LOs are established, we can analyze how they are enacted by looking at how they are taught, practiced, and assessed.40,56,60 Assessment is how we determine whether students have demonstrated acceptable achievement of the LOs—we ask: what evidence would demonstrate that a student has achieved the intended LOs? From this, educators can begin to make decisions about the type(s) of assessment needed to adequately assess students’ achievement of the LOs. Similarly, students can use the LOs as a tool to create their own mock practice questions.

Table 5.2  The prompts use the LOs and explicitly demonstrate the expectation from each component of the reasoning framework.

Question: Q1: Comparing mechanisms
Revised prompt: Which reaction is most likely to proceed by the mechanism shown? Justify your answer with multi-component causal reasoning by comparing the two reaction coordinate diagrams using an explanation at the molecular level.

Question: Q2: Acid–base
Revised prompt: Circle the base below that can be used to drive the equilibrium to products. Justify your answer using at least linear causal reasoning at the reaction or molecular level of granularity (why you chose one base and did not choose the others), using chemical structures as part of your answer.

Question: Q3: Delocalization
Revised prompt: Rank the resonance structures in order of contribution to the hybrid. Justify your answer at the descriptive level by comparing the resonance structures.


Lastly, with the LOs, assessments, and expectations in hand, instruction can be designed to explicitly focus on the desired outcome. If educators expect learners to achieve a specific LO, explicitly teaching it and assessing it throughout a course sequence provides learners with the tools, opportunities, and knowledge needed to succeed.56 Followed in this manner, backwards design that starts with LOs can ensure that students have the necessary learning opportunities throughout a course to successfully develop and demonstrate each LO.

5.4.2  Scaffolding Skill Development

Students’ reasoning skill development can also be supported through scaffolds: supports and intermediate steps that guide students toward the intended LO.61 In the context of argumentation, scaffolds can provide the structure for students to practice constructing arguments with specific components in mind (e.g., scaffolds that explicitly ask for a claim, evidence, and reasoning). In accordance with constructive alignment, LOs and expectations need to guide the design of the scaffolds that build students’ skills. For example, researchers have proposed ways to design scaffolds for organic chemistry that focus on outcomes related to mechanistic reasoning.29,62 Another example could be using the fine-grained LOs in Table 5.2 as scaffolded steps. For the acid–base question analyzed in our research, a scaffold could first ask students to provide the pKa values of the protonated species (granularity), compare the three pieces of evidence (comparison), explain why pKa is used in this case (causal reasoning), and finally state a claim. This type of scaffolding lays out the expectations for the learner, ensuring that they are aware of what they must demonstrate within their argument. Additionally, prompting students to think about what they have learned from scaffolds and to apply their knowledge to new tasks could help them develop metacognitive and transfer skills.62,63 As students’ skills and learning expectations increase, scaffolds can also be removed or faded, though research is needed on the timing and impacts of fading scaffolds on students’ arguments in chemistry.7

The following example of scaffolding comes from an online organic chemistry course with 300 students on Zoom. The LO for the class was: Identify the claim, evidence, and reasoning within a pre-written chemistry argument.
Before class, students watched videos on scientific argumentation and reasoning, including modes of reasoning, comparisons, and granularity.64,65 They also completed a pre-class quiz to assess their basic comprehension of the videos. In class, students first answered a multiple-choice question using an online response system: “Which substrate is most likely to react by the first step indicated, A or B?” (Figure 5.5). Next, they used the response system to give their reasoning, being prompted to provide a linear causal response (examples in Figure 5.5b). They were then asked to join a breakout room and identify the claim, evidence, and reasoning in one


Figure 5.5  Students identified which substrate (A or B) was most likely to react by the first step shown. (a) Students worked together in breakout rooms to identify the claim, evidence, and reasoning in a pre-constructed argument. (b) Students identified the mode of reasoning in class examples, which had been written during the class.

answer (prepared in advance); students then shared their answers back with the full class, with an accompanying discussion (Figure 5.5a). Students were then shown a selection of their classmates’ responses and asked to classify each one as descriptive, relational, linear causal, or multi-component causal (Figure 5.5b). They were prompted to revise and resubmit their own answers based on the discussion. Finally, the class moved on to a new example to further practice these skills.

Rubrics are another way to communicate expectations. Rubrics can be used to make explicit the expectations for a particular learning outcome—the rubric shown in Figure 5.6 could be used to guide how an instructor builds assessments and/or the teaching experience. The rubric could also be provided directly to students to show what is expected and how they will be assessed.


Figure 5.6  A rubric for the reasoning framework with the expectation of each question and alignment of answers with expectations.

5.4.3  Resources for Constructively Aligning Reasoning into a Course

The Flynn Research Group website66 contains resources on how to use the reasoning, granularity, and comparisons framework and align it within a course.

5.5  Conclusions

In a world facing complex global issues, citizens need to be able to make and justify decisions, an important aspect of scientific argumentation skills. As chemistry educators, we can embed scientific argumentation into our teaching and assessment to support students’ argumentation skills. Herein, we share a framework that incorporates three essential aspects of effective argumentation: reasoning (i.e., descriptive, relational, linear causal, and multi-component causal), granularity (the scalar levels discussed in arguments), and comparisons (e.g., comparisons between possibilities, counterclaims). In our research using the framework, we found that students’ reasoning, granularity, and comparisons varied between tasks but were consistent with the expectations associated with each task. Therefore, we share how constructive alignment might be used to maximize students’ opportunities to develop and demonstrate argumentation-related learning outcomes. With this approach, intended learning outcomes are used to guide instructional decisions, including pedagogy and assessment. Future work could investigate how students perceive course expectations in more and less aligned curricula, which specific features of a course students rely on when trying to make sense of expectations (in both aligned and misaligned curricula), how students’ arguments develop over time in different curricula, the timing and effectiveness of fading scaffolds on student argumentation in chemistry, the effects of context on students’ arguments,51,67,68 and the effects of grading practices on students’ skill development.69 Argumentation and reasoning are essential skills for both scientists and citizens in a world facing complex challenges; evidence-based educational practices, including constructive alignment, may provide effective support for students in developing and demonstrating these skills.

References

1. United Nations, Transforming our World: the 2030 Agenda for Sustainable Development [Internet], 2015. Available from: https://www.un.org/sustainabledevelopment/sustainable-development-goals/.
2. Social Sciences and Humanities Research Council, Truth Under Fire in a Post-Fact World [Internet], 2018 [cited 1 September 2019]. Available from: https://horizons.gc.ca/en/2018/10/19/the-next-generation-of-emerging-global-challenges/#truth-under-fire.
3. National Research Council, A Framework for K-12 Science Education [Internet], National Academies Press, Washington, D.C., 2012. Available from: https://www.nap.edu/read/13165/chapter/1.
4. European Union, Recommendation of the European Parliament and of the Council of 18 December 2006 on key competences for lifelong learning, Off. J. Eur. Union, 2006, L 394/19-L 394/18.
5. Organisation for Economic Cooperation and Development, Assessing Scientific, Reading and Mathematical Literacy: A Framework for PISA 2006, 2006.
6. D. Kuhn, The Skills of Argument, Cambridge University Press, New York, 2011.
7. K. L. McNeill, D. J. Lizotte, J. Krajcik and R. W. Marx, Supporting Students’ Construction of Scientific Explanations by Fading Scaffolds in Instructional Materials, J. Learn. Sci., 2006, 15(2), 153–191.
8. J. F. Osborne and A. Patterson, Scientific Argument and Explanation: A Necessary Distinction?, Sci. Educ., 2011, 95(4), 627–638.
9. S. Toulmin, The Uses of Argument, Cambridge University Press, Cambridge, 1958.
10. K. L. McNeill and J. Krajcik, Inquiry and Scientific Explanations: Helping Students Use Evidence and Reasoning, in Science as Inquiry in the Secondary Setting, 2008, pp. 121–134.
11. M. D. Jones and D. A. Crow, How can we use the “science of stories” to produce persuasive scientific stories, Palgrave Commun., 2017, 3(1), 1–9.
12. A. Moon, C. Stanford, R. Cole and M. H. Towns, The nature of students’ chemical reasoning employed in scientific argumentation in physical chemistry, Chem. Educ. Res. Pract., 2016, 17(2), 353–364.
13. K. L. McNeill and L. K. Berland, A Learning Progression for Scientific Argumentation: Understanding Student Work and Designing Supportive, Sci. Educ., 2010, 765–793.
14. V. Dawson and K. Carson, Using climate change scenarios to assess high school students’ argumentation skills, Res. Sci. Technol. Educ., 2017, 35(1), 1–16.


15. L. K. Berland and B. J. Reiser, Making Sense of Argumentation and Explanation, Sci. Educ., 2009, 93, 26–55.
16. J. Emig, Writing as a Mode of Learning, Coll. Compos. Commun., 1977, 28(2), 122–128.
17. B. I. Grimberg and B. Hand, Cognitive pathways: Analysis of students’ written texts for science understanding, Int. J. Sci. Educ., 2009, 31(4), 503–521.
18. G. J. Kelly, J. Regev and W. Prothero, Analysis of Lines of Reasoning in Written Argumentation, in Argumentation in Science Education, ed. S. Erduran and M. P. Jimenez-Aleixandre, Springer Science, 2007, pp. 137–158.
19. N. Y. P. Diana, Assessment of Argumentation in Chemistry: A Model for Designing Items, in Argumentation in Chemistry Education: Research, Policy and Practice, 2019, pp. 106–141.
20. N. E. Bodé, J. M. Deng and A. B. Flynn, Getting Past the Rules and to the WHY: Causal Mechanistic Arguments When Judging the Plausibility of Organic Reaction Mechanisms, J. Chem. Educ., 2019, 96(6), 1068–1082.
21. J. M. Deng and A. B. Flynn, Reasoning, granularity, and comparisons in students’ arguments on two organic chemistry items, Chem. Educ. Res. Pract., 2021, 22, 749–771.
22. A. Moon, R. Moeller, A. R. Gere and G. V. Shultz, Application and testing of a framework for characterizing the quality of scientific reasoning in chemistry students’ writing on ocean acidification, Chem. Educ. Res. Pract., 2019, 20(3), 484–494.
23. L. McClary and V. Talanquer, Heuristic reasoning in chemistry: making decisions about acid strength, Int. J. Sci. Educ., 2011, 33(10), 1433–1454.
24. V. Talanquer, Concept Inventories: Predicting the Wrong Answer May Boost Performance, J. Chem. Educ., 2017, 94(12), 1805–1810.
25. A. Kraft, A. M. Strickland and G. Bhattacharyya, Reasonable reasoning: multi-variate problem-solving in organic chemistry, Chem. Educ. Res. Pract., 2010, 11(4), 281–292.
26. G. Bhattacharyya and G. M. Bodner, “It gets me to the product”: How students propose organic mechanisms, J. Chem. Educ., 2005, 82(9), 1402–1407.
27. J. L. Kolodner, An Introduction to Case-Based Reasoning, Artif. Intell. Rev., 1992, 6, 3–34.
28. N. J. Nersessian, Model-Based Reasoning in Conceptual Change, in Model-Based Reasoning in Scientific Discovery, 1999, pp. 5–22.
29. M. M. Cooper, H. Kouyoumdjian and S. M. Underwood, Investigating Students’ Reasoning about Acid-Base Reactions, J. Chem. Educ., 2016, 93(10), 1703–1712.
30. R. S. Russ, R. E. Scherr, D. Hammer and J. Mikeska, Recognizing Reasoning in Student Scientific Inquiry: A Framework for Discourse Analysis Developed From Philosophy of Science, Sci. Educ., 2008, 92(3), 499–525.


31. J. Maeyer and V. Talanquer, Making Predictions About Chemical Reactivity: Assumptions and Heuristics, J. Res. Sci. Teach., 2013, 50(6), 748–767.
32. K. E. Stanovich and R. F. West, Individual differences in reasoning: Implications for the rationality debate?, Behav. Brain Sci., 2003, 26(4), 527.
33. V. Talanquer, Explanations and Teleology in Chemistry Education, Int. J. Sci. Educ., 2007, 29(7), 853–870.
34. V. DeCocq and G. Bhattacharyya, TMI (Too much information)! Effects of given information on organic chemistry students’ approaches to solving mechanism tasks, Chem. Educ. Res. Pract., 2019, 20(1), 213–228.
35. I. Caspari, D. Kranz and N. Graulich, Resolving the complexity of organic chemistry students’ reasoning through the lens of a mechanistic framework, Chem. Educ. Res. Pract., 2018, 19, 1117–1141.
36. L. Wright, Explanation and teleology, Philos. Sci., 1972, 39, 204–218.
37. M. L. Weinrich and H. Sevian, Capturing students’ abstraction while solving organic reaction mechanism problems across a semester, Chem. Educ. Res. Pract., 2017, 18(1), 169–190.
38. D. L. Schwartz, The Emergence of Abstract Representations in Dyad Problem Solving, J. Learn. Sci., 1995, 4(3), 321–354.
39. A. B. Flynn and W. W. Ogilvie, Mechanisms before Reactions: A Mechanistic Approach to the Organic Chemistry Curriculum Based on Patterns of Electron Flow, J. Chem. Educ., 2015, 92(5), 803–810.
40. M. A. R. Raycroft and A. B. Flynn, What works? What’s missing? An evaluation model for science curricula that analyses learning outcomes through five lenses, Chem. Educ. Res. Pract., 2020, 21(4), 1110–1131.
41. H. Sevian and V. Talanquer, Rethinking chemistry: a learning progression on chemical thinking, Chem. Educ. Res. Pract., 2014, 15(1), 10–23.
42. P. Moreira, A. Marzabal and V. Talanquer, Using a mechanistic framework to characterise chemistry students’ reasoning in written explanations, Chem. Educ. Res. Pract., 2019, 20(1), 120–131.
43. M. S. Carle, R. El Issa, N. Pilote and A. B. Flynn, Ten essential delocalization learning outcomes: How well are they achieved?, ChemRxiv, 2021, 1–28.
44. V. Talanquer, Assessing for Chemical Thinking, in Research and Practice in Chemistry Education, Springer Nature Singapore Pte Ltd, Singapore, 2019, pp. 123–133.
45. K. W. X. Soo, The Role of Granularity in Causal Learning, University of Pittsburgh, 2019.
46. V. Talanquer, Progressions in reasoning about structure–property relationships, Chem. Educ. Res. Pract., 2018, 19(4), 998–1009.
47. M. H. W. van Mil, D. Jan, B. Arend and J. Waarlo, Modelling Molecular Mechanisms: A Framework of Scientific Reasoning to Construct Molecular-Level Explanations for Cellular Behaviour, Sci. Educ., 2013, 22(1), 93–118.
48. K. M. Southard, M. R. Espindola, S. D. Zaepfel and S. Molly, Generative mechanistic explanation building in undergraduate molecular and cellular biology, Int. J. Sci. Educ., 2017, 39(13), 1795–1829.


49. P. L. Luisi, Emergence in Chemistry: Chemistry as the Embodiment of Emergence, Found. Chem., 2002, 4(3), 183–200.
50. L. Darden, Strategies for Discovering Mechanisms: Schema Instantiation, Modular Subassembly, Forward/Backward Chaining, Philos. Sci., 2002, 69(S3), 354–365.
51. P. G. Mahaffy, A. Krief, H. Hopf, G. Mehta and S. A. Matlin, Reorienting chemistry education through systems thinking, Nat. Rev. Chem., 2018, 2(4), 126.
52. A. H. Johnstone, Why is science difficult to learn? Things are seldom what they seem, J. Comput. Assist. Learn., 1991, 7, 75–83.
53. P. Machamer, L. Darden and C. F. Craver, Thinking about Mechanisms, Philos. Sci., 2000, 67(1), 1–25.
54. J. Biggs and C. Tang, Teaching for Quality Learning at University, The Society for Research into Higher Education, 4th edn, 2011.
55. J. Biggs, Enhancing teaching through constructive alignment, High. Educ., 1996, 32, 347–364.
56. V. R. Ralph, L. J. Scharlott, C. E. Schwarz, N. M. Becker and R. L. Stowe, Beyond Instructional Practices: Characterizing Learning Environments that Support Students in Explaining Chemical Phenomena, 2022, pp. 1–35.
57. J. Biggs and C. Tang, Aligning assessment tasks with intended learning outcomes: principles, in Teaching for Quality Learning at University, 2011, pp. 191–223.
58. G. Wiggins and J. McTighe, What is backward design, Underst. Des., 1998, 1, 7–19.
59. E. O’Connor, K. Roy, E. Walsh, D. Huang, D. Y. J. Ke, E. Campbell Brown, et al., The Evaluation of an Integrated Growth & Goals Module to Better Equip Students with Learning Skills in Postsecondary Courses: Systematic, Scalable, and Explicit, June 2021.
60. M. S. Carle and A. B. Flynn, Essential learning outcomes for delocalization (resonance) concepts: How are they taught, practiced, and assessed in organic chemistry?, Chem. Educ. Res. Pract., 2020, 21(2), 622–637.
61. D. Wood, J. S. Bruner and G. Ross, The Role of Tutoring in Problem Solving, J. Child Psychol. Psychiatry, 1976, 17(2), 89–100.
62. N. Graulich and I. Caspari, Designing a scaffold for mechanistic reasoning in organic chemistry, Chem. Teach. Int., 2021, 3(1), 19–30.
63. M. Rodemer, J. Eckhard, N. Graulich and S. Bernholt, Connecting explanations to representations: benefits of highlighting techniques in tutorial videos on students’ learning in organic chemistry, Int. J. Sci. Educ., 2021, 43(17), 2707–2728.
64. J. M. Deng and A. B. Flynn, Reasoning in scientific arguments [Internet], 2020 [cited 30 September 2021]. Available from: https://www.youtube.com/watch?v=4Gfg_BrHYLc.
65. J. M. Deng and A. B. Flynn, Scientific arguments: claim, evidence, reasoning [Internet], 2020 [cited 30 September 2021]. Available from: https://www.youtube.com/watch?v=-xOWE7Zzph8.
66. Flynn Research Group, Scientific Argumentation, 2021.


67. M. K. Orgill, S. York and J. Mackellar, Introduction to Systems Thinking for the Chemistry Education Community, J. Chem. Educ., 2019, 96(12), 2720–2729.
68. A. B. Flynn, M. Orgill, F. Ho, S. York, S. A. Matlin, D. J. C. Constable, et al., Future Directions for Systems Thinking in Chemistry Education: Putting the Pieces Together, J. Chem. Educ., 2019, 96(12), 3000–3005.
69. Ungrading: Why Rating Students Undermines Learning (and What to Do Instead), ed. S. D. Blum, West Virginia University Press, West Virginia, U.S., 2020.

Chapter 6

From Free Association to Goal-directed Problem-solving—Network Analysis of Students’ Use of Chemical Concepts in Mechanistic Reasoning†

Gyde Asmussen,a Marc Rodemer,b Julia Eckhard,c and Sascha Bernholt*a

aIPN – Leibniz Institute for Science and Mathematics Education, Department of Chemistry Education, Olshausenstr. 62, 24118 Kiel, Germany; bUniversity of Duisburg-Essen, Department of Chemistry Education, Schützenbahn 70, 45127 Essen, Germany; cJustus-Liebig-University Giessen, Institute of Chemistry Education, Heinrich-Buff-Ring 17, 35392 Giessen, Germany
*E-mail: [email protected]

6.1  Introduction

The discipline of organic chemistry is demanding, as various combinations of lines (representing bonds) and letters (representing atoms) entail a vast amount of implicit information. When encountering reaction mechanisms, multiple variables and causal relationships need to be integrated to describe,

†Electronic supplementary information (ESI) available: See Figures 6.A1–6.A6 and Table 6.A1. See DOI: 10.1039/9781839167782

  Advances in Chemistry Education Series No. 10 Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices Edited by Nicole Graulich and Ginger Shultz © The Royal Society of Chemistry 2023 Published by the Royal Society of Chemistry, www.rsc.org


explain, and predict chemical processes. Over the past two decades, research has revealed several challenges students face when enrolled in organic chemistry courses. Studies have focused on students’ reasoning about key mechanisms such as acid–base reactions,1,2 elimination3,4 or substitution reactions,4–8 among others.9 Students tend to focus on explicit surface features of the structural formulas,9–11 use few variables to solve problems,9 often base their claims on the stability of the final products, and neglect the reaction process.3,4 Instead of inferring implicit information, students further tend to rote-memorize10,12 and to use heuristics.13 Students’ challenges can be attributed to a lack of conceptual understanding and, relatedly, to not knowing in which contexts to apply certain concepts. While the tremendous amount of research on students’ mechanistic reasoning has provided a multifaceted picture of the problems and affordances students encounter, be it with specific mechanisms3,4 or with successfully applying specific concepts,10,14 substantially less is known about students’ selection of certain concepts, within and across tasks, and about how these concepts are connected to each other in students’ argumentation. In the present study, a cognitive task analysis was performed on students’ verbal explanations of a series of nucleophilic substitution reactions. Using comprehensive sample solutions, generated by combining organic chemistry professors’ solutions and textbook solutions, we draw conclusions regarding students’ consideration, or lack of consideration, of specific concepts across tasks. By elucidating students’ selection of concepts and how these are related to each other, we aim to infer implications for teaching and for supporting students in the process of inferring and appropriately using relevant concepts in the context of reaction mechanisms.

6.2  Theoretical Background

6.2.1  Reasons for Students’ Difficulties with Mechanistic Reasoning

To solve a task, a learner must rely on the availability of prior knowledge. In the context of reaction mechanisms in organic chemistry, this prior knowledge comprises an understanding of multiple embedded basic concepts, like electronegativity or resonance, that need to be used as a source for goal-directed predictions.11 However, “much of what is chemistry exists at a molecular level and is not accessible to direct perception” (p. 949).15 Consequently, students need to infer the relevance of a concept from specific pictorial representations or context information. In the process of recognizing a given problem as an instantiation of a particular concept, students are often misled by superficial features of the representation or the problem context. Consequently, many students use simple associations or try to rote-memorize solution steps10,12,13 instead of inferring and selecting the concept(s) relevant for goal-directed problem-solving.16,17 Also, students’ application of concepts often varies across contexts, as the


activation of a particular concept depends on specific visual features of the problem, even when the “deep structure” of the problem is identical,18 e.g., in terms of the same underlying reaction mechanism. This reliance on surface features is amplified by at least two aspects. First, the teaching of organic chemistry courses at universities is often organized according to a traditional curriculum that is structured by functional groups or name reactions (e.g., reactions of alkanes, alkenes, and alkynes, or carbonyl chemistry).19,20 As the learning of specific concepts is thereby tied to specific content, students are challenged to apply concepts and ways of reasoning across a broader variety of contexts and content bins.21 Second, rote memorization is one of students’ prominent strategies for processing the large number of mechanisms and name reactions in organic chemistry courses.22 Accordingly, mechanisms, name reactions, and concepts are perceived as a series of illustrative but unrelated examples, leading to weak connections between them. Consequently, studies on students’ understanding of reaction mechanisms often reveal very scattered knowledge elements instead of the solid and well-integrated knowledge base aimed for by their instructors.11

6.2.2  Organization of Knowledge Structure Through Cognitive Networks

To develop a deeper understanding of organic chemistry, a broad spectrum of abstract conceptual knowledge is required.11 In general, scientific concepts (like electronegativity) can be considered relational categories that relate several variables to each other.23 Acquiring such a concept enables the learner to classify problems “by their deep (common) relational structure and not (only) by superficial features” (p. 733).23 Ideally, individual aspects of a problem are not perceived separately and pieced together; rather, a more holistic picture is achieved in which a distinction is made between relevant and irrelevant information.24,25 However, goal-directed problem-solving also implies that concepts are not memorized individually but are stored in organized structures.26–28 Accordingly, new knowledge must be integrated (or connected) with prior knowledge elements during learning to be available for future problem-solving.29 Phrased differently, relating knowledge elements to each other is foundational for acquiring scientific concepts and, thus, for developing expertise.23,25 From this perspective, “conceptual knowledge organization is likely to be network-like” (p. 317).30 With increasing expertise, the degree to which concepts are structured into organized networks increases simultaneously,31,32 making it possible for expert organic chemists to appropriately apply their conceptual knowledge across contexts.33,34

A common approach to visualizing the organization of knowledge structures is through networks. Here, network representations take the structure and the interconnectedness of knowledge elements into account by depicting individual concepts as nodes and their relations as edges in a


graph.35 Edges not only represent whether two or more nodes are connected at all; they can also characterize the strength of a connection through higher edge weights, i.e., through the thickness of the lines displayed. Conceptually, such networks are assumed to indicate the semantic interrelations between a variety of conceptual terms from the perspective of a learner and, thus, to reflect the learner’s understanding of a particular topic.31,35,36 However, it must be acknowledged that such networks are only approximations, not exact displays, of students’ internal ideas and knowledge. Still, comparing a network based on students’ answers to a task with a network based on comprehensive research on a specific topic is an approach that makes it possible to identify students’ difficulties and knowledge gaps regarding that topic.14,35–37

For analyzing learning, the application of scale-free networks has been advocated.38 In the construction of this type of network, two underlying mechanisms are assumed to be relevant. First, a network grows by adding new nodes (e.g., when a learner encounters a new conceptual term). Second, new nodes are preferentially linked to those nodes that are already more densely connected to other nodes (preferential attachment39). This latter mechanism implies that some nodes are more central than others, as they function as hub or bridge concepts that support the association and integration of new nodes into the network.36,39 To identify and analyze these bridge concepts, the network parameters ‘degree centrality’ and ‘betweenness centrality’ have been proposed.40,41 Degree centrality (the number of edges connected to a given node) reflects the prominence of specific nodes in a network. Nodes with higher degree centrality show more relations, and more intense relations, to other nodes in the network. Betweenness centrality characterizes how often a given node serves as a bridge within the shortest path between two other nodes.
Thus, high values for betweenness centrality indicate that a concept is not only connected to several other concepts, but it also functions as a bridge that connects other concepts that are otherwise not directly connected.
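The two parameters can be illustrated with a minimal sketch on a toy concept network. The chapter’s own analysis used R with igraph and tnet; the pure-Python version below only demonstrates the measures, and the concept names are illustrative, not the study’s actual codes.

```python
# Stdlib sketch of degree and betweenness centrality on a small,
# hypothetical concept network (a tree, so every pair of nodes has
# exactly one shortest path). Names are illustrative only.
from collections import deque
from itertools import combinations

edges = [
    ("electronegativity", "-I effect"),
    ("electronegativity", "partial positive"),
    ("partial positive", "nucleophilic attack"),
    ("nucleophilic attack", "transition state"),
    ("nucleophilic attack", "steric hindrance"),
    ("transition state", "activation energy"),
]
adj = {}
for a, b in edges:
    adj.setdefault(a, set()).add(b)
    adj.setdefault(b, set()).add(a)
nodes = sorted(adj)
n = len(nodes)

def shortest_path(s, t):
    """BFS path from s to t (unique here because the graph is a tree)."""
    parent = {s: None}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            break
        for v in adj[u]:
            if v not in parent:
                parent[v] = u
                queue.append(v)
    path, u = [], t
    while u is not None:
        path.append(u)
        u = parent[u]
    return path[::-1]

# Degree centrality: edge count at a node, normalized by (n - 1).
degree = {v: len(adj[v]) / (n - 1) for v in nodes}

# Betweenness centrality: share of shortest paths between other node
# pairs that pass through a node, normalized by (n - 1)(n - 2) / 2.
pairs = (n - 1) * (n - 2) / 2
between = {v: 0.0 for v in nodes}
for s, t in combinations(nodes, 2):
    for v in shortest_path(s, t)[1:-1]:   # interior nodes only
        between[v] += 1 / pairs

# 'nucleophilic attack' bridges the structure-related and the
# energy-related parts of this toy network, so it tops both rankings.
print(max(degree, key=degree.get), max(between, key=between.get))
```

In this toy graph the bridging node scores highest on both measures; in real data, as the chapter notes, a concept can have a modest degree yet high betweenness when it is the only link between otherwise separate regions of the network.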

6.3  Research Questions

Numerous studies indicate that organic chemistry students tend to apply few (if not single) concepts when reasoning about reaction mechanisms and that this application of concepts is often driven by explicit features of the structural formulas or the problem context.9–11 In contrast, experts can draw upon a solid and well-integrated knowledge base that supports their goal-directed activation and application of concepts and schemas to solve the problem at hand.10,34 In the present study, we employ network analysis to investigate in detail which concepts students make use of and how they connect these concepts when working on a series of mechanistic tasks. In addition, we contrast these student networks with networks based on comprehensive sample solutions, i.e., combinations of organic chemistry professors’ solutions and


textbook solutions, to identify students’ difficulties and knowledge gaps. More specifically, we aim to address the following research questions:

1. Which chemical concepts are applied by students when solving case comparison tasks on nucleophilic substitution reactions?
2. How do students integrate multiple concepts across tasks?

The networks are used to visualize students’ connection-making between concepts and to examine the organization and interrelation of concepts in the students’ explanations in contrast to comprehensive sample solutions. Identifying missing or misguided relationships in students’ answers will provide insights for designing instruction that addresses these problems.31

6.4  Method

6.4.1  Cohort

The study took place at two German universities during the summer term of 2018 and was part of a broader study on students’ visual processing of representations.42 In the present study, 34 undergraduate chemistry students participated voluntarily. To cover a broad spectrum of organic chemistry students, the participants included students enrolled in the Organic Chemistry I and Organic Chemistry II courses as well as students who had already passed both courses. At the time of the study, all students had been taught the nucleophilic substitution mechanism and its influencing factors. The authors systematically compared the lectures at the two universities and found them comparable in course style, with a traditional curriculum (introducing functional groups, types of mechanisms, and name reactions) and an assessment style based on predict-the-product tasks. Further, each lecture was held by a professor, and the teaching style was comparable: nucleophilic substitution reactions were introduced in frontal teaching, covering the influencing factors, e.g., reactivity orders of different nucleophiles or the impact of different solvents. Furthermore, the same relevant books were recommended as supporting reading at both locations. The study complied with ethical guidelines and ensured that students could opt out at any time without disadvantage. Students were informed about their rights and the handling of the data. Informed consent was obtained from all students.

6.4.2  Case Comparison Tasks

A series of case comparison tasks on nucleophilic substitution reactions was presented to the students. Case comparison tasks have been shown to be potentially beneficial for students’ learning: they encourage deeper reasoning and the weighing of variables, as students are more likely to identify relevant information through comparison.43 Thus, case comparison tasks are also particularly useful for uncovering students’ reasoning.


Figure 6.1  Exemplary case comparison task (Task 1). Reactions A and B differ in a single structural feature of the substrate. Students were asked to contrast the reactions with respect to which reaction runs fastest. A complete list of tasks can be found in the online supplement (Figure 6.A1).

The analysis includes six tasks (Figure 6.1; see Figure 6.A1 in the online supplement for a complete list of tasks). In five tasks (tasks 1–5), the two cases differed in a single structural feature, e.g., different leaving groups; in one task (task 6), two structural differences affecting the reaction rate were displayed. The two differences were intended to increase the demands and to elicit a broader range of explanations. Influencing factors include leaving group ability, substrate effects, and nucleophilicity. In addition, a sample solution was created for each case comparison task. The sample solutions were developed based on textbooks44–46 and in collaboration with professors of organic chemistry to obtain correct and comprehensive task solutions that cover all potentially relevant concepts. The procedure for solving the case comparisons was divided into two steps. In the first step, the students were asked to decide quietly for themselves which reaction runs faster. For this, the students were given the following prompt: “Below, two similar reactions, A and B, are shown. These differ slightly. Identify the differences. Decide which reaction runs faster.” In the second step, the retrospective method of thinking aloud was used to have students explain their decision: “Below you can see the same reactions as before. Describe the reactions and the differences between A and B. Let us know which reaction runs faster and explain the reasons for your decision.”

6.4.3  Data Collection and Analysis

For data collection, semi-structured interviews were conducted. The students were interviewed individually, and each interview lasted approximately 20–60 min. The students’ explanations were audio recorded and transcribed


verbatim afterwards. To answer the research questions, the transcripts were analyzed by cognitive task analysis, a method for studying the use of knowledge to solve tasks and for uncovering decision-making processes.47,48 Here, the analysis focuses on the concepts students applied and the connections students made between these concepts. In the present analysis, concepts are considered to be relational categories that comprise classifications and categorizations of species (e.g., nucleophiles, solvent, Lewis base) as well as chemical principles and generalizations (e.g., resonance, electronegativity, inductive effects).23,49 In a first step, a deductive analysis was performed, including codes for concepts relevant to the mechanism of nucleophilic substitution reactions (based on the sample solution).44 In a second step, inductive codes were derived for concepts that students applied in addition to the deductively expected concepts. In both steps, codes were not only assigned when students explicitly stated a specific conceptual term but also when students paraphrased a concept or described its properties without using appropriate terminology. However, in both steps, concepts were coded only if they were relevant for solving the case comparison task, i.e., if they differed between reactions A and B. For example, if the nucleophile was mentioned but was the same molecule in reactions A and B, the code nucleophile was not assigned. The complete list of concept codes can be found in the online supplement (Table 6.A1). Based on a random selection of 20% of student answers, a mean value of 0.74 for observed agreement indicates high concordance between the three raters. On closer inspection, discrepancies between raters only pertain to cases in which a concept was coded by one rater but not by another (i.e., missing values in one of the ratings). Calculating Cohen’s κ based on listwise deletion (i.e., complete data)50 resulted in a value of 1, which indicates perfect agreement. Based on these values for inter-rater reliability, only one rater continued scoring the remaining answers.

For the network analysis, both the sample and the student solutions were converted into semantic networks, with nodes reflecting the concepts and edges reflecting the semantic relations between concepts. This process is illustrated in Table 6.1. All codes within a segment of a sample or student solution are related to each other, regardless of the total number of occurrences of a concept in that segment. Across segments, either new concepts (nodes) and connections (edges) are added, or repeated interactions between concepts are indicated by higher edge weights (i.e., thicker lines in the display of the networks, see Figure 6.2), reflecting the intensity of these interactions. Based on these networks, degree centrality and betweenness centrality were calculated for each node (i.e., each concept) in the network. Both parameters take the location of a node and its connections to other nodes into account, indicating the relative prominence of each node in terms of being involved in many (direct or indirect) ties between nodes.41 Both parameters were normalized by network size for further calculations in order to adjust for different numbers of nodes in networks based on different tasks or data sources (i.e., sample solution vs. student answers).
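The conversion of coded segments into a weighted semantic network can be sketched as follows. This is a minimal illustration of the procedure described above, not the study’s actual pipeline (which used R with tnet and igraph); the segment codings below are hypothetical.

```python
# Sketch of turning coded answer segments into a weighted semantic
# network: all concept codes within one segment are linked pairwise,
# and repeated co-occurrence across segments raises the edge weight.
# The segments below are made up for illustration.
from itertools import combinations

segments = [
    {"transition state", "nucleophilic attack", "steric hindrance"},
    {"steric hindrance", "substrate structure", "primary substrate"},
    {"transition state", "nucleophilic attack"},   # repeated pair
]

weights = {}   # edge (frozenset of two concepts) -> weight
for seg in segments:
    for pair in combinations(sorted(seg), 2):
        key = frozenset(pair)
        weights[key] = weights.get(key, 0) + 1

nodes = {v for edge in weights for v in edge}

# Normalized degree centrality per node (edge count / (n - 1)), as a
# simple size-adjusted prominence measure, ignoring edge weights.
n = len(nodes)
deg = {v: sum(v in e for e in weights) / (n - 1) for v in nodes}

heavy = max(weights, key=weights.get)
print(sorted(heavy), weights[heavy])   # the repeated pair has weight 2
```

Normalizing by network size, as in the last step, is what allows networks from different tasks or data sources to be compared despite differing numbers of nodes.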

Table 6.1  For each task solution, codes are derived to categorize the named concepts. The networks are developed based on these concepts.

Example quote (referring to task 1, Figure 6.1): “In the SN2 mechanism, the transition state corresponds to the point of highest energy. In the transition state, the formation of the new σ-bond between the nucleophile and the nucleophilically attacked carbon atom, as well as the breaking of the old σ-bond between the nucleophilically attacked carbon atom and the leaving group, has not yet completely taken place. 1-Chlorobutane is sterically little hindered, which has a positive effect on the relative reaction rate.”

Concept codes: transition state, nucleophilic attack, substrate structure, steric hindrance, primary substrate. (The exemplary network derived from these codes is displayed in the original table.)

Figure 6.2  Integrated networks of all six sample solutions (top) and all student solutions to the six case comparison tasks (bottom). Colors indicate subnets of the network (so-called communities) that have a relatively large number of internal ties (i.e., between the nodes of the subnet) and few ties to nodes from other subnets, as determined on the basis of the integrated sample solution (top). Individual networks of sample and student solutions to each of the six tasks can be found in the online supplement (Figure 6.A2 and Figure 6.A3).

Technically, the network analysis was performed using R 4.1.0 51 and several packages, notably tnet 3.0.16,52 igraph 1.2.6,53 tm 0.7–8,54 and tidyverse 1.3.1.55

6.5  Results

Regarding the sample solutions, 42 chemical concepts were identified and coded across the six tasks (see Table 6.A1 in the online supplement for a complete list of codes). While some of these concepts are relevant when


solving all six tasks (e.g., electronegativity or nucleophilic attack), other concepts correspond to one of the three major factors (substrate, nucleophile, and leaving group) that influence the reaction rate of nucleophilic substitution reactions. Consequently, these concepts are more prevalent in those tasks in which the respective factor is implemented as the explicit difference to be contrasted in the case comparison. For instance, concepts like leaving group ability or bond strength to the substrate only appear in the sample solutions of task 2 and task 5, which ask students to compare the influence of two different leaving groups on the speed of the reaction (see dark blue area in Figure 6.3, middle). When comparing the different networks of sample and student solutions (Figure 6.3), the number of concepts incorporated in the student solutions is substantially lower than in the sample solutions. When directly contrasting the sample solution with the individual student solutions, the ratio between incorporated and expected concepts ranges from 0.15 to 0.37 (with a mean ratio of 0.24) across the six tasks. The integrated student networks (i.e., the networks of all students for a specific task), however, comprise a substantially larger share of the concepts addressed in the sample solution (ranging from 0.53 to 0.90, with a mean ratio of 0.72). Across tasks, the individual student solutions partly overlap, as indicated by specific subnets in the individual tasks that comprise the concepts and relations that seem to be most prominent to students (see colored areas in Figure 6.3). When visually comparing sample and student solutions across the different tasks (see also Figure 6.A2 and Figure 6.A3 in the online supplement for a complete overview of networks across tasks), a certain degree of resemblance is obvious. The overall structure of the students’ integrated networks follows a pattern comparable to that of the sample solutions, both in close correspondence to the structural differences that were incorporated in the different case comparison tasks. When focusing on the specific concepts that students incorporated in their problem solutions, some concepts are applied more often (across students and tasks), e.g., leaving group, nucleophile, and stability, while other concepts are rarely employed, e.g., activation energy, transition state, or basicity. Considering the correctness of students’ use of a particular concept, correct use appears to be more likely for concepts that students used more frequently: based on a linear regression, the proportion of correct application significantly increases with the frequency with which a concept is applied (R2 = 0.15, F(1, 54) = 9.51, β = 0.39, 95% CI [0.14, 0.64], p = 0.003; for a visualization of the distribution see Figure 6.A4 in the online supplement). To further substantiate the analysis of the prominence and integration of specific concepts in the student solutions beyond the focus on frequencies of application, two common network parameters are considered to contrast student and sample solutions: degree centrality and betweenness centrality. When applying degree centrality to the sample and student solutions, most concepts show only low values, indicating low prominence of these concepts across tasks. However, discrepancies between student and sample solutions
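The incorporated-to-expected ratios reported above amount to simple set arithmetic over the coded concepts. A sketch with hypothetical placeholder sets (not the study’s data) shows why the integrated cohort-level coverage exceeds any individual student’s coverage:

```python
# Sketch of the concept-coverage ratios: concepts per solution are
# treated as sets, and coverage is the share of the sample solution's
# concepts that a student (or the pooled cohort) also applied.
# All concept sets here are made up for illustration.
sample = {"leaving group", "leaving group ability", "basicity",
          "transition state", "activation energy", "electronegativity",
          "nucleophilic attack", "substrate structure"}

student_a = {"leaving group", "electronegativity"}
student_b = {"leaving group", "stability", "nucleophilic attack"}

def coverage(solution, reference):
    """Share of reference concepts that also occur in the solution."""
    return len(solution & reference) / len(reference)

individual = [coverage(s, sample) for s in (student_a, student_b)]
integrated = coverage(student_a | student_b, sample)   # pooled cohort

print([round(r, 2) for r in individual], round(integrated, 2))
```

Because different students contribute different concepts, the union over the cohort covers more of the sample solution than either student alone, mirroring the study’s mean ratios of 0.24 (individual) versus 0.72 (integrated).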


Figure 6.3  Exemplary semantic networks of sample (left) and student (right) solutions to case comparison tasks 4, 5 and 6, aiming to contrast the influence of different substrates, different leaving groups, or different nucleophiles, respectively, on the speed of the reaction. Colors indicate subnets of the network that share many internal ties and few external ties, as determined on the basis of the integrated sample solution (Figure 6.2, top). Networks of sample and student solutions to all six tasks can be found in the online supplement (Figure 6.A2 and Figure 6.A3).


become obvious for the more prominent concepts in both networks (Figure 6.4). In the student solutions, the concepts −I effect, electronegativity, electron-pushing and electron-withdrawing substituents, partial positive, nucleophilic attack, and stability show the highest values for degree centrality. Most of these, and several other concepts, also show substantially higher values for degree centrality than in the sample solutions (Figure 6.4, left). Conversely, only some of these concepts (electronegativity, nucleophilic attack, −I effect) also show high degree centrality in the sample solutions, where other concepts (transition state, basicity, activation energy, Hammond postulate, and substrate structure) are the most prominent. The second network parameter, betweenness centrality, reflects a more functional perspective: high values indicate a bridge function of a particular concept in terms of connecting other concepts. These concepts are thus positioned at critical locations of the network, as they might control the co-activation of related concepts. Under this perspective, the concepts electronegativity, −I effect, stability, and electron-pushing and electron-withdrawing substituents (which also show high values for degree centrality; see above), but also leaving group, leaving group capacity, and substrate structure, can be considered structurally important in the student networks. In the sample solutions, electronegativity, inductive effects (both −I/+I), transition state, nucleophilic attack, solvation, nucleophilicity, substrate structure, and basicity can be considered important (Figure 6.5). While some concepts show high values for betweenness centrality in both sample and

Figure 6.4  Normalized degree centrality of concepts in sample and student solutions, ordered by the difference in degree centrality between the sample and student solutions. Only concepts with degree centrality values above 0.02 in either sample or student solutions are displayed (see Figure 6.A5 in the online supplement for a figure comprising all coded concepts).


Figure 6.5  Normalized betweenness centrality of concepts in sample and student solutions, ordered by the difference in betweenness centrality between the sample and student solutions. Only concepts with betweenness centrality values above 0.02 in either sample or student solutions are displayed (see Figure 6.A6 in the online supplement for a figure comprising all coded concepts).

student networks (e.g., inductive effects, electronegativity), discrepancies in relative importance are most prevalent for basicity, substrate structure, nucleophilicity, and solvation (with higher values in the sample solutions) as well as stability (with higher values in the student solutions). To illustrate how these patterns in abstract network parameters manifest in the student and sample solutions, Table 6.2 shows two example quotes. Both quotes pertain to the final step of each argumentation, when a decision is reached about which reaction in the case comparison (task 6; see Figure 6.A1 in the online supplement) proceeds faster. In the student solution, only a few concepts are considered (as was also the case in the full solution preceding this quote), and these concepts mainly pertain to structural features of the reactants. While these concepts are also relevant in the sample solution, they are complemented there by more general chemical principles and rules. More importantly, these additional concepts are not only related to the codes reflecting structural features, but they organize the whole argumentation pattern by providing a bridge between structure and energy.4,56,57

6.6  Discussion and Conclusions

The analysis of students’ explanations of the series of case comparison tasks indicated that students often apply only very few concepts to substantiate their reasoning. On average, the sample solution, i.e., organic chemistry professors’ solutions combined with textbook solutions, covered four times more concepts than students made use of. This finding concurs with evidence


Table 6.2  Exemplary quotes from a student statement and from the sample solution deciding between the two reactions in the case comparison of task 6 (cf. Figure 6.A1), with the concepts coded in each quote.

Student solution: “[…] Although one could also say that the molecule CS is in itself smaller than the carbon with two methyl groups and one oxygen, so to speak. If one assumes that, then A should proceed more quickly.” Coded concepts: nucleophile, steric hindrance.

Expert solution: “[…] Reaction A proceeds preferentially. The oxygen nucleophile is sterically more demanding due to its branching, which means that this nucleophile can approach the back side of the carbon atom less well, which also manifests itself in a higher potential energy of the transition state in comparison to A. Since the transition state of an SN2 reaction, a trigonal–bipyramidal transition state, is sensitive to steric hindrance, beta atoms, as with isopropylate, have a slowing effect on the reaction rate due to repulsive interactions.” Coded concepts: nucleophile, steric hindrance, nucleophilic attack, transition state, activation energy, secondary substrate.

from prior studies.9,14 The integrated networks across all student answers comprised a substantially larger number of concepts (with a mean ratio of 0.72 when compared to the sample solutions). As higher values for the integrated networks are to be expected (students collectively should be closer to the comprehensive sample solution than individual students), this finding also supports the validity of the analytical approach. In addition, the high coverage of concepts in the integrated student solutions indicates that most concepts seem to be available in principle at the cohort level, but that individual students consider different concepts relevant for the respective tasks, which leads to a great variety of different solutions. Across the six tasks, specific structural differences had to be compared in the case comparisons. In concordance with the sample solutions, students’ integrated networks reveal a comparable pattern of applied concepts corresponding to the structural differences students were prompted to compare. Consequently, while students’ individual answers differ in terms of the specific concepts that are incorporated, the activation of particular concepts in the students’ explanations cannot be considered arbitrary but is rather related to specific visual characteristics students notice in the task. Characteristics related to the explicit differences seem to be especially salient to students, resulting in specific activation patterns of concepts across individuals.18 However, additional explicit surface features of the tasks (e.g., the solvent or ring structures) seem to vary in their potential to attract the individual student’s attention, resulting in varying co-activation patterns of concepts across students.9,10


When further considering the correctness with which a specific concept is applied in a student’s solution, the proportion of correct applications is higher for concepts that were employed more often (see Figure 6.A4 in the online supplement). This relationship indicates a kind of familiarity of students with certain concepts, which also makes them more likely to recognize a task with certain structural features as an instantiation of a specific concept and to apply this concept correctly when working on the task.18,23 These findings support the assumption that some concepts are more central than others in terms of their position and function in the network.35 These more prominent and highly integrated concepts are assumed to have gained this position through repeated successful application and, in turn, to influence the co-activation of connected concepts in the network.39 However, the present analysis cannot answer whether familiarity or salient features of the case comparison (e.g., specific functional groups in the structural formulae) have the stronger effect in activating specific concepts, especially for concepts that were seldom applied across tasks. When further comparing the prominence of concepts in the sample and student solutions (in terms of degree centrality), only a few concepts (namely electronegativity and nucleophilic attack) showed high degree centrality in both types of networks. When further comparing the lists of the most prominent concepts in the two networks (sample vs. student solutions), the prominent concepts in the sample solutions can be considered rather general chemical principles and rules relevant to the mechanism of nucleophilic substitution reactions (activation energy, basicity, Hammond postulate, substrate structure, and transition state), while the prominent concepts in the student solutions mainly reflect structure–property relationships (charge distribution based on differences in electronegativity, i.e., electron-pushing and electron-withdrawing substituents, partial positive, or −I effect) and rather general categories (stability) that often reflect common student heuristics and short-cuts.10,12,13 Concepts pertaining to structure–property relationships occur in the sample solutions as well, as these concepts are also relevant there, but the more general rules and principles seem to play a structuring role only in the more comprehensive sample solutions. However, the present analysis is based only on single comprehensive sample solutions (i.e., solutions of organic chemistry professors in combination with textbook solutions); it can be assumed that experts’ task solutions also differ in the extent to which they draw on a broader range of concepts and in how they structure their explanations.57 Regarding the second network parameter, betweenness centrality, some concepts show high values in both sample and student networks (e.g., inductive effects, electronegativity). However, discrepancies in relative importance are most prevalent for basicity, nucleophilicity, and solvation (favored in the network of the sample solutions) as well as electron-withdrawing substituents and stability (favored in the network of the student solutions). This finding might be related to the nature of the sample solutions, since organic chemistry professors’ (and textbook) solutions present a topic in ways that


align with the disciplinary framing. Hence, they create a context by giving insights into the broader reaction conditions, i.e., the participating entities and their capacities, such as nucleophiles/nucleophilicity and solvents/solvation. Furthermore, it has been shown that when organic chemistry professors were asked to give a mechanistic explanation in a teaching context, they pursued several conceptual approaches drawing on multiple concepts,57 which relates to the increased number of concepts compared to students’ networks. When considering a task with different leaving groups, they reasoned about leaving group ability but further included basicity in their line of reasoning. The latter thus corresponds to the prominence of the concept “basicity” in the sample solution. In consequence, rather general chemical principles and rules serve a structuring function in the sample networks, while structure–property relationships and rather general categories take over this position in the student networks.

6.6.1  Implications for Teaching

Taken together, the findings of the present study indicate that some concepts are only seldom applied by students, although they are relevant in multiple tasks. While concepts pertaining to structure–property relationships are a relevant resource for students when working on nucleophilic substitution reactions, the more general chemical principles and rules are not very prominent in students’ networks and, thus, cannot guide the co-activation of additional concepts or structure students’ answers (in contrast to the sample solutions). Here, innovative approaches to restructuring traditional courses and curricula20,58 might support students in transferring concepts beyond the content with which they were first introduced. Repeated application of concepts across multiple problems is necessary to enable students to generalize and de-contextualize these general principles and concepts, which in turn should also be reflected in a more prominent position of these concepts in students’ derived knowledge networks.31,35,37 However, numerous studies have repeatedly shown that knowledge acquired from contextualized learning does not easily transcend the learning context,59,60 calling for explicit scaffolding and deliberate practice.59 Here, the item format of case comparison tasks, as used in this study, might be beneficial in shifting students’ attention towards the chemical process of a reaction61 and in engaging them in reasoning about concepts and their influences related to purposefully set differences between two similar reactions. This approach is known to support students’ schema acquisition and the development of transferable knowledge.43,62 While the findings presented here are mainly descriptive in nature, some implications for supporting students’ learning in organic chemistry might be derived, at least under the assumption that students’ learning should lead them to consider more variables and use more integrated concepts when approaching tasks such as those employed in this study. When comparing sample and student solutions, students seem to need additional support in considering multiple concepts instead of focusing on single variables and concepts.9


Considering multiple concepts, however, also implies a more coherent strategy for selecting the concepts that are assumed relevant for the problem at hand. On the one hand, students need to be supported in looking beyond the surface features of the given representations or problem context and in developing more analytical procedures for evaluating and categorizing various features of the problem and, thus, deriving relevant concepts.18 On the other hand, specific concepts that could serve an anchoring function, activating further concepts from students’ prior knowledge, might be emphasized more explicitly during teaching. For instance, training students to perceive structural features of a given molecule in order to evaluate (implicit) concepts, e.g., whether a species potentially acts as a nucleophile, might lead them to think further about nucleophilicity and possible sites for a nucleophilic attack before finally considering how steric hindrance influences this process, instead of directly relating (salient) structural features to steric hindrance without considering any particular chemical process.14 Strengthening the perception of key concepts such as nucleophilicity might support students in better understanding these concepts, but also in developing such meaningful activation chains of related concepts. Thinking about activation chains, or the co-activation of relevant concepts, could also be fruitful for better integrating energetic considerations into students’ reasoning. When comparing structural features in terms of their influence on reaction rates, energy-related concepts (e.g., activation energy, transition state, or the Hammond postulate) are necessary to draw a conclusion. In the present study (and in line with findings from prior studies4,63), however, these concepts are hardly integrated into students’ answers. Here, linking structural to energetic considerations more explicitly seems necessary to provide bridges to these ‘rare’ concepts. Specifically, ‘stability’ might be such a bridging concept. While stability is often used by students without a clear meaning, or ambiguously, as a decision criterion for comparing reaction rates, the results on betweenness centrality in this study underline the prominence of ‘stability’ in the students’ answers, i.e., the concept ‘stability’ already functions as a bridge that connects other concepts that are otherwise not directly connected. Instruction that further clarifies the different conceptual meanings (in terms of a thermodynamic or a kinetic interpretation) could support students in making more differentiated use of this concept and in attaching additional, energy-related concepts to this hub, which might then bridge these energetic considerations to the structure-related concepts already attached to it. While case comparison tasks are a fruitful approach to encourage deeper reasoning and the weighing of variables,61 additional scaffolding might be necessary to support students in activating, selecting, and incorporating relevant concepts in their reasoning.

Acknowledgements

We would like to thank the German Research Foundation DFG (Deutsche Forschungsgemeinschaft) for funding this research (project number: 329801962).

From Free Association to Goal-directed Problem-solving


References

1. M. M. Cooper, H. Kouyoumdjian and S. M. Underwood, J. Chem. Educ., 2016, 93(10), 1703.
2. J. A. Schmidt-McCormack, J. A. Judge, K. Spahr, E. Yang, R. Pugh, A. Karlin, A. Sattar, B. C. Thompson, A. R. Gere and G. V. Shultz, Chem. Educ. Res. Pract., 2019, 20(2), 383.
3. D. Cruz-Ramírez de Arellano and M. H. Towns, Chem. Educ. Res. Pract., 2014, 15(4), 501.
4. I. Caspari, D. Kranz and N. Graulich, Chem. Educ. Res. Pract., 2018, 19(4), 1117.
5. N. E. Bodé, J. M. Deng and A. B. Flynn, J. Chem. Educ., 2019, 96(6), 1068.
6. O. M. Crandell, M. A. Lockhart and M. M. Cooper, J. Chem. Educ., 2020, 97(2), 313.
7. A. J. Dood, J. C. Dood, D. Cruz-Ramírez de Arellano, K. B. Fields and J. R. Raker, J. Chem. Educ., 2020, 97(10), 3551.
8. F. M. Watts, J. A. Schmidt-McCormack, C. A. Wilhelm, A. Karlin, A. Sattar, B. C. Thompson, A. R. Gere and G. V. Shultz, Chem. Educ. Res. Pract., 2020, 21(4), 1148.
9. A. Kraft, A. M. Strickland and G. Bhattacharyya, Chem. Educ. Res. Pract., 2010, 11(4), 281.
10. G. Bhattacharyya and G. M. Bodner, J. Chem. Educ., 2005, 82(9), 1402.
11. N. Graulich, Chem. Educ. Res. Pract., 2015, 16(1), 9.
12. R. S. DeFever, H. Bruce and G. Bhattacharyya, J. Chem. Educ., 2015, 92(3), 415.
13. J. Maeyer and V. Talanquer, J. Res. Sci. Teach., 2013, 50(6, SI), 748.
14. M. E. Anzovino and S. L. Bretz, Chem. Educ. Res. Pract., 2016, 17(4), 1019.
15. R. B. Kozma and J. Russell, J. Res. Sci. Teach., 1997, 34(9), 949.
16. M. T. H. Chi, P. J. Feltovich and R. Glaser, Cogn. Sci., 1981, 5, 121.
17. K. R. Galloway, M. W. Leung and A. B. Flynn, Chem. Educ. Res. Pract., 2019, 20(1), 30.
18. M. T. H. Chi and K. A. VanLehn, Educ. Psychol., 2012, 47(3), 177.
19. J. R. Raker and T. A. Holme, J. Chem. Educ., 2013, 90(11), 1437.
20. A. B. Flynn and W. W. Ogilvie, J. Chem. Educ., 2015, 92(5), 803.
21. V. Talanquer, J. Chem. Educ., 2018, 95(11), 1905.
22. N. P. Grove and S. Lowery Bretz, Chem. Educ. Res. Pract., 2012, 13(3), 201.
23. M. B. Goldwater and L. Schalk, Psychol. Bull., 2016, 142(7), 729.
24. P. Benner, Bull. Sci. Technol. Soc., 2004, 24(3), 188.
25. S. E. Dreyfus, Bull. Sci. Technol. Soc., 2004, 24(3), 177.
26. J. D. Novak and B. Gowin, Learning How to Learn, Cambridge Univ. Press, Cambridge, 1999.
27. I. M. Greca and M. A. Moreira, Int. J. Sci. Educ., 2000, 22(1), 1.
28. P. N. Johnson-Laird, Mental Models: Towards a Cognitive Science of Language, Inference, and Consciousness, Harvard Univ. Press, Cambridge, 1995.
29. K. R. Koedinger, A. T. Corbett and C. Perfetti, Cogn. Sci., 2012, 36(5), 757.
30. A. Gupta, D. Hammer and E. F. Redish, J. Learn. Sci., 2010, 19(3), 285.


Chapter 6

31. J. D. Novak, Learning, Creating, and Using Knowledge: Concept Maps as Facilitative Tools in Schools and Corporations, Routledge, New York, NY, 2010.
32. S. Lowery Bretz, J. Chem. Educ., 2001, 78(8), 1107.
33. G. Bhattacharyya and G. M. Bodner, J. Res. Sci. Teach., 2014, 51(6), 694.
34. K. R. Galloway, M. W. Leung and A. B. Flynn, J. Chem. Educ., 2018, 95(3), 355.
35. I. T. Koponen and M. Nousiainen, Phys. A, 2018, 495, 405.
36. O. Daems, M. Erkens, N. Malzahn and H. U. Hoppe, J. Comput. Educ., 2014, 1(2–3), 113.
37. S. Podschuweit and S. Bernholt, Educ. Sci., 2020, 10(4), 103.
38. M. J. Jacobson and M. Kapur, in Proceedings of the 9th International Conference of the Learning Sciences – Volume 2, International Society of the Learning Sciences, 2010, p. 193.
39. A.-L. Barabási, R. Albert and H. Jeong, Phys. A, 1999, 272(1–2), 173.
40. Network Analysis: Methodological Foundations, ed. U. Brandes and T. Erlebach, Springer, Berlin, 2005.
41. D. Luke, A User’s Guide to Network Analysis in R, Springer International Publishing, Cham, 2015.
42. M. Rodemer, J. Eckhard, N. Graulich and S. Bernholt, J. Chem. Educ., 2020, 97(10), 3530.
43. L. Alfieri, T. J. Nokes-Malach and C. D. Schunn, Educ. Psychol., 2013, 48(2), 87.
44. P. Y. Bruice, Organic Chemistry, Pearson, Boston, 2014.
45. Organische Chemie, ed. K. P. C. Vollhardt, N. E. Schore, H. Butenschön and K.-M. Roy, Wiley-VCH, Weinheim, 2011.
46. J. Clayden, N. Greeves and S. G. Warren, Organic Chemistry, Oxford University Press, Oxford, 2012.
47. S. E. Gordon and R. T. Gill, in Naturalistic Decision Making, ed. C. E. Zsambok and G. Klein, Psychology Press, 2014, p. 131.
48. E. Salas and J. A. Cannon-Bowers, Annu. Rev. Psychol., 2001, 52, 471.
49. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives, ed. L. W. Anderson and D. R. Krathwohl, Longman, White Plains, NY, 2001.
50. A. de Raadt, M. J. Warrens, R. J. Bosker and H. A. L. Kiers, Educ. Psychol. Meas., 2019, 79(3), 558.
51. R Core Team, R: A Language and Environment for Statistical Computing, Vienna, Austria, 2021, https://www.R-project.org/.
52. T. Opsahl, Structure and Evolution of Weighted Networks, University of London (Queen Mary College), London, UK, 2009.
53. G. Csardi and T. Nepusz, Int. J. Complex Syst., 2006, 1695, 1–9.
54. I. Feinerer, K. Hornik and D. Meyer, J. Stat. Softw., 2008, 25(5), 1.
55. H. Wickham, M. Averick, J. Bryan, W. Chang, L. McGowan, R. François, G. Grolemund, A. Hayes, L. Henry, J. Hester, M. Kuhn, T. Pedersen, E. Miller, S. Bache, K. Müller, J. Ooms, D. Robinson, D. Seidel, V. Spinu, K. Takahashi, D. Vaughan, C. Wilke, K. Woo and H. Yutani, J. Open Source Softw., 2019, 4(43), 1686.


56. W. M. Goodwin, Found. Chem., 2008, 10(2), 117.
57. J. Eckhard, M. Rodemer, A. Langner, S. Bernholt and N. Graulich, Chem. Educ. Res. Pract., 2022, 23(1), 78.
58. M. M. Cooper, R. L. Stowe, O. M. Crandell and M. W. Klymkowsky, J. Chem. Educ., 2019, 96(9), 1858.
59. S. B. Day and R. L. Goldstone, Educ. Psychol., 2012, 47(3), 153.
60. D. L. Schwartz and J. D. Bransford, Cogn. Instr., 1998, 16(4), 475.
61. N. Graulich and M. Schween, J. Chem. Educ., 2018, 95(3), 376.
62. J. Roelle and K. Berthold, Cogn. Instr., 2015, 33(3), 199.
63. N. M. Becker and M. M. Cooper, J. Res. Sci. Teach., 2014, 51(6), 789.

Chapter 7

Epistemic Stances in Action—Students’ Reasoning Process While Reflecting About Alternative Reaction Pathways in Organic Chemistry

Leonie Lieber and Nicole Graulich*

Justus-Liebig-University, Institute of Chemistry Education, Heinrich-Buff-Ring 17, 35392 Giessen, Germany
*E-mail: [email protected]

Advances in Chemistry Education Series No. 10
Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices
Edited by Nicole Graulich and Ginger Shultz
© The Royal Society of Chemistry 2023
Published by the Royal Society of Chemistry, www.rsc.org

7.1  Introduction

In chemistry education research, the “product” of a reasoning process, in terms of performance and outcome, has received more attention than how a reasoning “process” unfolds. Hammer et al. stated that how it happens is much more important than whether students apply knowledge from one reasoning context to another.1 While determining students’ use of knowledge is important, for example, as part of diagnosing their learning and providing support for meaningful reasoning processes,2 epistemology, as the knowledge about the nature and epistemic justification of knowledge, is an inherent aspect of students’ reasoning processes as well. Focusing on students’ epistemologies helps us gain a deeper understanding of how knowledge is experienced and viewed by students.3 Epistemology refers to reflecting on beliefs about knowledge, but also to reflecting on beliefs about strategies for solving problems and reasoning about those problems.4 Knowledge and epistemology are thus closely linked. On the one hand, an epistemic stance is expressed, for example, through various ways of considering the value and application of knowledge. On the other hand, acquiring knowledge in turn also invokes one’s reasoning about the nature of this knowledge.4 By explicitly looking at epistemology, it is also possible to elicit, among other things, the extent to which students are critical about statements, how they make judgments about claims, and how they justify them with evidence and reasoning.4 For this reason, looking at students’ reasoning processes while taking their epistemic stances into account might be a way to view and characterize students’ approaches to problem-solving more holistically. Such a process lens is also increasingly applied in chemistry education research. Rodriguez et al., for example, analyzed the problem-solving process of physical chemistry students and characterized its (un)productive features.5 Investigating students’ problem-solving with a process-oriented lens gives additional power to the analysis of students’ answers because the use of (un)productive features helps ascertain an individual’s problem-solving approach. Kelly et al. investigated the solving process of general chemistry students by applying Hammer and Elby’s resources framework and the associated epistemic stances.6 The goal of Kelly et al. was to characterize students’ understanding of the atomic level. What stands out in both studies is that incorporating students’ epistemologies in the analysis shifts the focus from the (in)correct use of conceptual knowledge towards how students approach and experience the knowledge pieces they have to use in the problem-solving process.
Combining an epistemic lens with a process-oriented lens may offer an alternative perspective on understanding student behaviour during problem-solving, eliciting students’ needs and deriving supportive strategies for teaching.7 The goal is less a correction of epistemic beliefs or stances and more a way to figure out how students’ epistemic stances (productively) influence their reasoning process and what actions trigger them.8

7.1.1  Reasoning in Students’ Argumentation

In chemistry, building arguments is a necessary skill for evaluating reaction processes and a core ability in science.9 More explicitly, while solving tasks about reaction mechanisms in organic chemistry, students need to argue and reason about the structural changes occurring between the reagents involved and the cause of these changes. To explain causes, students need to be able to justify their evidence with reasoning that explains why reaction steps occur.10 However, building complete arguments and appropriately using scientific principles to justify claims is challenging.11 As a result, students often use unlinked and sometimes erroneous pieces of knowledge12 as well as one-reason decision-making based on single pieces of evidence.11,13 Building arguments has been a prominent field of research, for example in physical chemistry14 and organic chemistry.15 Toulmin’s argumentation pattern is predominantly used. As the key components claim, data, and warrant are present in every argument,16 the claim-evidence-reasoning (CER) model, a simplified version of Toulmin’s argumentation pattern, is applied as well.11 The claim is the basis of an argument, which acts as a statement about a problem11 and is always in doubt, meaning that a justification is required.9 Evidence is composed of scientific data and serves to support the claim. Reasoning functions as the justification and is the most difficult part of an argument because reasoning needs to provide a bridge between claim and evidence.11

7.1.2  Toward an Understanding of Epistemic Stances

From a philosophical perspective, a stance is a mixture of strategies and behaviours that are important for the formation of beliefs.3,17 Epistemic stances do not in themselves make claims about particular aspects but are the way one makes claims or expresses thoughts about knowledge. Therefore, stances are not beliefs, but are adopted and expressed through human actions, making them neither wrong nor right.3 Thus, it is also possible to experience contrary stances.18 However, epistemic stances are usually implicit; one does not use them intentionally.17 Looking at epistemic stances from an educational perspective allows us to understand how these stances can negatively or positively influence the learning process and thus to draw inferences about students’ conceptual understanding.19 From a science education point of view, there are different ways to approach epistemologies. For example, students’ epistemologies can be thought of as beliefs that mainly consist of stable cognitive structures.20 Research based on this conceptualization is more likely to insist on changing these beliefs.21 Another understanding is that epistemologies consist of fine-grained resources and are to be distinguished from beliefs.22 Epistemologies are thus understood by Hammer and Elby as a contextual activation of resources, whereby they are neither right nor wrong, but can be used more or less productively.22 Besides other applications of resources, such as the nature of knowledge (e.g., knowledge as fabricated stuff), Hammer and Elby defined a possible process-oriented lens as “resources for understanding stances one may take toward knowledge” (p. 12).22 This includes, for example, regarding understanding as the experience of an idea that seems to make sense, belief or disbelief as a reflection of understanding, and doubting as an expression of uncertainty because a piece of information is neither rejected nor accepted.22 Investigating when stances occur in the learning context and how students experience those stances allows us to look at students’ reasoning process in a more holistic way. Combining the perspective on students’ argumentation (i.e., in terms of claim-evidence-reasoning) with the perspective on students’ epistemic stances offers the chance for a more comprehensive understanding of an individual’s reasoning approach.


7.2  Research Questions

As research has shown that epistemic stances influence how students use and reflect on knowledge, we wanted to investigate how epistemic stances are enacted (i.e., expressed in action while students are reasoning) in the context of students reflecting on the plausibility of alternative reaction pathways. The case study presented here is thus guided by the following two research questions.

1. How are epistemic stances in action and argument components (claim–evidence–reasoning) linked in students’ reasoning process?
2. How are epistemic stances in action related to turning points in students’ reasoning process?

7.3  Study Design and Methods

The following case study is part of a study conducted at a German university in fall 2019.15,23 Twenty-nine chemistry major students participated on a voluntary basis. They were recruited at the beginning of the Organic Chemistry III course, which requires successful completion of Organic Chemistry I and II. Each reaction mechanism and the associated chemical concepts of the research instrument had been discussed multiple times throughout their courses beforehand. Prior to the interviews, students were informed about their rights, gave their written permission, and had the opportunity to stop the interview at any point. For the case study presented herein, we analyzed the judgement of plausibility for the reaction of 4-chlorobutanol and hydroxide to tetrahydrofuran (THF, see Figure 7.1) by two students, whom we call Taylor and Robin. We chose gender-neutral names and refer to the students using the pronouns they/them/their. Taylor and Robin were chosen to represent the cohort because both students experienced well identifiable turning points and various epistemic stances. Moreover, Taylor is a student with good conceptual knowledge, whereas Robin is conceptually weaker. The research instrument consisted of four tasks (see Figure 7.1).23 First, students were asked to form the product of a well-known organic chemical reaction. In the second task, students were asked to build the product of a reaction that differed from the first reaction by only one additional structural feature. In the third task, the students were given five alternative reaction products for the reaction shown in the second task and were prompted to decide whether the respective reaction product seemed (im)plausible to them. Lastly, the students were asked again if they would like to change their own reaction product from the second task.

Figure 7.1  Outline of the research instrument with the product cards discussed by the students in this case study highlighted in green.

7.3.1  Data Analysis

The selected transcripts of the two students underwent a thematic content analysis. All steps of the analysis were discussed with the co-author multiple times. The analysis took place in two separate steps. In the first step, the participants’ argumentation process was analyzed. This was based on a simplified version of Toulmin’s argumentation model (claim-evidence-reasoning), as described in Lieber and Graulich.15 A claim was coded as the position being argued for, which was either that the reaction product is plausible or implausible. To determine statements as evidence or reasoning, the structure of students’ arguments was considered. If a statement supported a claim, e.g., with an answer as to why a certain claim is plausible or implausible, it was identified as evidence. If a statement justified evidence, e.g., by explaining why a specific leaving group is good or bad, it was coded as reasoning. In the second step, the data were coded deductively using the epistemic stances described by Hammer and Elby.22 The categories consist of the stances belief, disbelief, doubting, understanding, acceptance, and puzzlement (see Table 7.1). However, as students do not explicitly refer to, and are not aware of, these stances, we focused on how a stance was expressed in action by the students (i.e., utterances describing an experienced emotion) (see Table 7.1). The appropriateness and correctness of students’ statements (i.e., arguments and stances) were not the focus of the analysis.

Table 7.1  Description of the six epistemic stances by Hammer and Elby22 and the code descriptions that guided the data analysis.

Category       Code description
Belief         Students express that a given piece of information seems reasonable.
Disbelief      Students express that a given piece of information does not seem reasonable.
Doubting       Students express neither accepting nor rejecting a given piece of information.
Understanding  Students express the experience that an idea makes sense.
Acceptance     Students express the experience of believing an idea, which does not necessarily mean that they understand the idea.
Puzzlement     Students express the experience that an idea does not make sense.
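The two-step coding described above pairs each utterance with a stance in action and, where present, an argument component; the research questions then ask how stances relate to turning points. As a hedged illustration (the coded sequence below is hypothetical and not taken from Taylor’s or Robin’s transcripts, and the actual analysis was done qualitatively, not computationally), such coded data and a naive turning-point locator might look like:

```python
# Hedged illustration only: stance labels follow Table 7.1, but the
# coded sequence is invented for this example.
STANCES = {"belief", "disbelief", "doubting", "understanding",
           "acceptance", "puzzlement"}

# Each utterance: (stance in action, co-occurring argument component,
# or None when no claim/evidence/reasoning was expressed).
coded_sequence = [
    ("puzzlement", None),
    ("puzzlement", "claim"),
    ("doubting", "evidence"),
    ("doubting", "evidence"),
    ("belief", "reasoning"),
]

for stance, _component in coded_sequence:
    assert stance in STANCES  # guard against typos during manual coding

def turning_points(sequence):
    """Indices where the expressed stance differs from the previous utterance."""
    return [i for i in range(1, len(sequence))
            if sequence[i][0] != sequence[i - 1][0]]

print(turning_points(coded_sequence))  # prints [2, 4]
```

The sketch merely operationalizes the two research questions on coded data: stance–component co-occurrence (each tuple) and stance changes as candidate turning points.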


7.4  Results and Discussion

The following case descriptions illustrate in depth which types of stances, turning points, and components of the CER argumentation model occur and co-occur in students’ problem-solving processes while reflecting on the plausibility of THF in the reaction of 4-chlorobutanol with hydroxide.

7.4.1  Case 1—Taylor

Taylor is a student who quickly settled into the interview situation. The idea of thinking aloud and expressing the thought process was well understood by the student. Taylor was patient and communicative and contributed to a positive atmosphere during the interview. The sequence shown in Figure 7.2 illustrates Taylor’s process of solving the task. Reactants and products were presented to Taylor on product cards. The task in this section was to make a reasoned decision about the plausibility of the product THF. It is noticeable that Taylor experienced several stances of doubting that were followed by beliefs. This indicates that Taylor was able to overcome the doubting processes productively. Moreover, Taylor built multiple pieces of evidence as well as reasoning statements to justify their claims. When Taylor was presented with the different product cards, which were introduced as peer solutions, they experienced puzzlement regarding THF as the product. Taylor intuitively picked up this card and expressed that this product was confusing. The puzzlement was not connected with deeper reasoning about the underlying chemical mechanism, as only confusion was expressed in Taylor’s utterance. However, before Taylor reasoned further about the possible formation of THF, they took up other product cards. Only about ten minutes later, Taylor took the THF product card again and had a moment of understanding, as Taylor was able to correctly understand how the product card of the alkoxide, the precursor to THF, was formed (see Figure 7.2A). In section A, Taylor experienced understanding because they comprehended how another student came up with THF as a product, and this assumed thought process made sense to Taylor. However, in this sequence of the interview, no aspects of argument formation occurred, as Taylor did not make a statement about whether the product seemed plausible or implausible. Moreover, Taylor’s reasoning remained descriptive, stating that the alcohol is deprotonated and attacked by a charged species intramolecularly. Taylor continued by drawing the mechanism of the THF formation. Following the mechanistic description provided, the interviewer asked whether Taylor thought the reaction product was plausible.

Figure 7.2  Chronological sequence of Taylor’s interview, the stances in action, and components of the argumentation while reasoning about THF as a plausible reaction product.
So far, Taylor had experienced puzzlement at first sight but was able to build the mechanism without further prompting by the interviewer, which was expressed as understanding. In the following statement, Taylor experienced a turning point, comparable to a eureka moment, which was accompanied by doubting and ended with a belief. In the interview excerpt (see Figure 7.2B), Taylor expressed doubts at the beginning as to whether considering THF a plausible reaction product is reasonable. Taylor started to reason about the plausibility of the THF formation, which resulted in an ambiguous claim. In this thought process, however, Taylor experienced a turning point when realizing that THF might be a feasible product, finding it increasingly “fascinating”. When asked, Taylor overcame the doubting temporarily because they elaborated on this statement by using chemical concepts (see Figure 7.2B). In doing so, Taylor realized that THF is a plausible product, which was expressed as a belief. Moreover, Taylor realized in this moment that their solution to the previous task was incorrect (Taylor’s product of the reaction was a diol in task 2). To further support their statement substantially, Taylor provided the evidence that the formed alkoxide is a nucleophile (see Figure 7.2B). This section is ground-breaking for the reasoning process because Taylor overcame their doubting by including chemical concepts in their argumentation. This feeling of success helped Taylor in the following argumentation process, since they focused on using concepts in their argumentation. When asked why they had not thought of the deprotonation of the hydroxyl group before, even though they had previously talked about hydroxide being a strong base, Taylor went on to say that they did not know whether a one-step process or a ring closure is faster. This open question started a new sequence of doubting for Taylor, in which Taylor struggled to support one or the other option and did not decide between them. However, through this doubting process Taylor provided another piece of evidence for their claim by stating that a ring closure will not proceed as quickly. Following this, Taylor experienced a belief when stating that now THF would be “a super product”. The interviewer asked why, which Taylor answered with acceptance. The excerpt (see Figure 7.2C) shows that Taylor did not further express why it might be plausible and remained at a level of accepting the THF formation. In this moment, Taylor seemed to just express their feelings. Such moments of expressed acceptance without further reasoning are often found in students’ utterances.24 One assumption could be that students intuitively follow or trust their first thought and then mentally search for some sort of information that would contradict their feelings. This mental search did not further substantiate Taylor’s feeling in this moment. In this stance of acceptance, Taylor experienced the believing of an idea. With the phrases “I don’t know” and “it just doesn’t seem unlikely to me anymore”, it is not clear whether Taylor also understood the idea. From this stance of acceptance, without further substantiating their claim mechanistically, Taylor then experienced a productive moment. Without prompting, Taylor suddenly formed the evidence that “acid–base reactions are super-fast”.
This productive moment, which they initiated themselves, was ground-breaking for the further process, as it confirmed their previous assumptions about the plausibility of THF, and the reaction rate became an important factor for Taylor in the further weighing process. Taylor was then asked to justify why acid–base reactions are faster, which they answered in a first step by referring to experiences from an analytics lab. When asked again to try to reason chemically, Taylor uttered their first reasoning statement and justified the evidence by stating that in an acid–base reaction, two highly charged species react with each other. To support Taylor in overcoming their previous doubting processes, the interviewer next asked specifically whether the alkoxide or the hydroxide is the better nucleophile in this reaction. Up to this point, Taylor had formed arguments exclusively for the claim that THF is a plausible reaction product. However, with the question of which nucleophile is better, Taylor formed the first evidence against the formation of THF by stating that the hydroxide is the better nucleophile. After a brief inquiry, Taylor quickly formed two reasoning statements that supported this evidence (the hydroxide is more mobile, and the carbon skeleton must first rotate). However, forming a counterargument did not support Taylor in advancing their final decision but triggered another doubting, as Taylor contradicted themselves while weighing both arguments. This moment of doubting, verging on despair, could be felt throughout the room, because Taylor reflected on what was going on mentally (see Figure 7.2D). While Taylor revised their counterargument against THF formation, namely that hydroxide is the better nucleophile, it occurred to them that intramolecular reactions are faster than intermolecular reactions. Taylor supported this directly with the reasoning statement that there is spatial proximity. During their doubting, Taylor had a productive strategy to face the situation. They did not give up but incorporated more and more conceptual aspects into their problem-solving process to obtain an outcome that was satisfactory to themselves. Thereby, it became clear that Taylor named many influencing factors that affected their decision at the end of the sequence (see Figure 7.2D). Taylor used concept knowledge from a variety of domains, addressed both electronic and kinetic aspects, and supported them with, for example, nucleophilicity or statements about spatial arrangement. At the end of this sequence, Taylor started to include energetic aspects in their weighing process. However, Taylor did not reach a result satisfactory to them either, because they stated that both reactions have the same energy balance, which kept them in their doubting process. Although Taylor did not finally resolve their thoughts about the entropy change in the THF formation, they expressed a stance of belief when considering THF to be a plausible product of the reaction, overcoming their doubting in the end by building evidence and reasoning. To summarize, Taylor experienced many ups and downs during their problem-solving process, characterized by understanding and belief, but also doubting and puzzlement.
Taylor’s argumentation process is characterized in sum by five evidence statements and four reasoning statements, which illustrates that many aspects were incorporated and weighed against each other. At the end, Taylor noted that working on the tasks was an “emotional roller coaster” and that they often reach the point in exams where an initial decision arising from a gut feeling is questioned on taking a closer look. Taylor’s turning points were often induced by doubting phases and advanced the problem-solving process, as doubting in Taylor’s case led them to include and activate more resources in the weighing process. This situation led them to productive moments, resulting, for example, in a eureka moment. Taylor’s case illustrates that this task put Taylor in the zone of proximal development.25 The task was not too complicated, but challenging. Taylor was able to activate additional knowledge resources and to advance their reasoning based on both their own and the interviewer’s prompts. Although Taylor expressed doubting, this epistemic stance resulted in productive reasoning, illustrating that inducing doubting, either through instructional prompts or by providing students with conflicting results, can initiate productive reasoning processes. Students with low prior knowledge, in contrast to Taylor, might easily experience frustration, as the concept knowledge relevant to advancing their decision-making process might be missing. In those cases, support cards with additional information (e.g., acid–base reactions are fast) might be appropriate in teaching situations to help students incorporate new aspects into their reasoning.


7.4.2  Case 2—Robin

Robin is a student who was very interested in participating in the interview and the research from the beginning. They made a great effort to answer all questions. Robin was polite, curious, and reflective, and readily changed their mind when they thought an argument was crucial. However, Robin often relied on intuition. Figure 7.3 shows the interview sequence for the formation of THF. For Robin, justifying the plausibility of THF was closely linked to the precursor alkoxide, so Robin repeatedly switched between the two product cards. Compared to Taylor, Robin experienced quick changes between multiple stances like puzzlement and doubting. Robin supported their claims with several pieces of evidence but with fewer reasoning statements than Taylor. Robin, like Taylor, was presented with the product cards at the beginning of the work phase (Figure 7.1, step 3). Robin immediately took the card with THF, experienced a disbelief, and expressed a claim and evidence. Excerpt A (see Figure 7.3) shows how these two aspects are intertwined and that argument components can trigger stances in action and vice versa. These two sentences might be short; nevertheless, they illustrate that epistemic stances can switch quickly. At the beginning, Robin expressed a disbelief by excluding THF as a reaction product. Robin stayed on an intuitive level and experienced a puzzlement because the idea of this product card “looks very wrong”. This puzzlement was followed by an evidence statement saying that only a base could form THF (see Figure 7.3A). After a short discussion about terms, Robin was prompted to elaborate on the expressed disbelief that THF is implausible. Robin continued to refer to intuition (“this looks wrong”), though this time Robin did not use a concept to support the statement. From this it was clear that Robin continued to experience puzzlement and did not (yet) substantiate why they experienced disbelief. Compared to Taylor, Robin experienced more puzzlement, which is more difficult to overcome because Robin did not express what they were struggling with. In excerpt B, one can notice the uncertainty Robin experienced while working on the task, as they changed from puzzlement to understanding and then to doubting (see Figure 7.3). This phase of uncertainty was also accompanied by lower self-confidence, which also became clear in the personal conversation with Robin. The puzzlement represented a third repetition of the statement that “the molecule looks wrong”, which Robin again justified with intuition, without referring to chemical concepts to support the claim. In the second part of the quote (see Figure 7.3B), it also turned out that Robin was uncertain about their previous reasoning, as in this moment Robin correctly described how THF forms and expressed that “it would certainly be possible”. This uncertainty ended in doubting, as Robin was confused by mixed emotions: the possibility of THF forming and their gut feeling.

Figure 7.3  Chronological sequence of Robin’s interview, the stances in action, and components of the argumentation while reasoning about THF as a plausible reaction product.
In doubting, however, Robin was more specific and stated more precisely what bothered them: the deprotonation. Robin reinforced this doubting with evidence by referring to the basicity of the hydroxide and the hydroxyl group. In section B, the differences between puzzlement and doubting become noticeable: doubting can be more productive than puzzlement because it draws on chemical concepts, and using concepts not only strengthens arguments but is also a first step towards overcoming the doubting. After two more prompts, Robin still expressed doubting but formed more argument components that supported the original disbelief, such as that “you have to use a stronger base” and that the most acidic site would be at a C–H bond, which Robin justified with reasoning by saying “the OH group pulls out electrons and the site is then partially positively charged”. Robin made the last two statements with greater confidence and certainty compared to the previous intuitive statements, briefly overcoming the doubting process. Doubting quickly returned when Robin again expressed that statistically the reaction would work, but one would probably need a stronger base. Since Robin was using concept knowledge and the conversation kept revolving around the same conflict, the interviewer went a step further and talked about the intramolecular reaction step that leads to THF (see Figure 7.3C). When compared to Taylor, Robin needed more prompts

Epistemic Stances in Action—Students’ Reasoning Process


to master the argumentation process. Nevertheless, both students were able to build argument components to justify their claims and to overcome their doubting. Robin claimed that the intramolecular reaction step is plausible. When asked why, Robin had a reasoned answer using several chemical concepts. In addition, Robin experienced understanding as they correctly described the mechanism (see Figure 7.3C). In contrast to the previous sequences, Robin felt certain in answering the question. Robin’s moment of understanding regarding the mechanism was supported by the evidence that (1) intramolecular reactions are faster, (2) chloride is a good leaving group, and (3) the alkoxide is a nucleophile (see Figure 7.3C). When asked whether the alkoxide or the hydroxide is the better nucleophile, Robin hesitated briefly but quickly settled on the alkoxide, further strengthening the previously stated belief substantially. The interviewer asked again whether Robin thought the overall reaction leading to THF is plausible. The interplay of the disbelief regarding the alkoxide and the belief regarding THF created doubting in Robin, as the following claims show: Robin alternated between “it is possible”, “it is unlikely”, and “theoretically possible” within a few words. In the last part of the task sequence, Robin was asked whether they wanted to stay with their own product from the second task or revise it, and why they thought the nucleophilic attack is preferred over the basic attack. This question seemed to trigger a turning point for Robin, as they mentioned evidence that ultimately led them to consider THF the most plausible product of the reaction (see Figure 7.3D). Even though, contrary to Robin’s claim, the interviewer did not use the term acid–base reaction, the question seemed to have triggered something.
Robin was able to activate knowledge about the kinetics of acid–base reactions, which was the crucial point for Robin to overcome the previous doubting. It became clear that Robin, due to this prompting, experienced a belief, which they substantiated with evidence. The task of making a reasoned decision about whether THF is a plausible reaction product was noticeably difficult. Robin experienced many moments of doubting and puzzlement, as well as moments of understanding, during this decision process, which was evident through the alternation of belief and disbelief. Ultimately, Robin was able to overcome their puzzlement by incorporating chemical concepts into the explanation instead of relying only on their gut feeling. However, compared to Taylor, Robin often needed prompts to overcome stances, while Taylor experienced turning points by themselves. While the incorporation of chemical concepts often ended in doubting, it was productive, as Robin used more and more concept knowledge, gained confidence, and overcame doubting through a turning point. In the reflection, Robin also provided a rationale for why they initially relied on intuition. Robin stated at the end that “it was more automatic” and that they paid attention to whether they had “seen reaction or reactant before”. Robin expressed that they “think less about what happens and how and shoot more from the hip”. This expressed reliance on recognition, neglecting analytical thinking when solving problems, is a well-known phenomenon in research.26

7.5  Conclusion and Implications

The two cases described here illustrate the interplay of epistemic stances and argumentation in the reasoning processes of two chemistry major students. First, we looked at the connection between epistemic stances (derived from Hammer and Elby22) and argument components using a simplified version of Toulmin’s argumentation model.15 Second, we focused on moments that illustrate how epistemic stances in action influenced students’ turning points in their argumentation process. Particularly with the focus on epistemic stances, it was noticeable that a doubting process often leads to a more intensive examination of the content and the use of more argument components. This observation calls on us as educators to reflect on how moments of puzzlement, disbelief or cognitive conflict27 can engage deeper reasoning. Doubting should therefore not be perceived as something negative but as supportive for learning. In the case of puzzlement, the students seemed to struggle to understand for themselves what their thought processes were, which made it difficult to activate chemical content to overcome the puzzlement. Turning points could arise as a reaction to a prompt (see Robin) but could also be induced by the students themselves (see Taylor). This often initially led to overcoming puzzlement or doubting and resulted in belief or disbelief. However, newly activated knowledge emerging at a turning point could also lead to stances like puzzlement or doubting. Both students experienced various stances and different turning points throughout this interview sequence and were additionally guided by an interviewer. Although the reaction seems simple, the two students experienced this problem-solving process differently. In both cases, however, epistemic stances such as puzzlement or doubting led to a more intense reasoning process and advanced the students’ decision-making.
These cases thus illustrate how such moments can lead to higher engagement and a productive activation of additional conceptual resources to make a final claim. However, the variance in both cases and the explicit guidance possible in an interview tell educators to pay more attention to individual differences in how students experience these moments and what they need to overcome their struggles. This case study can only shed light on the individual experiences of two students in an advanced organic chemistry course, so implications for a general population of students are hard to draw. The cases also illustrate that, had the interviewer not continued to prompt without revealing the solution, the reasoning process would have stopped earlier, before the students had the chance to reason further and solve the task. This reminds us to create learning opportunities in which students are not directly provided with the correct or complete solution when they struggle; students might be able to recall and use their conceptual knowledge on their own. Taylor’s turning point is indicative of this. Presenting students with peer solutions, as shown here with alternative products, or with videos of peers explaining problem solutions28 might initiate a doubting and reflection process, instead of providing students with correct expert answers and prompting them to explain those. However, in a teaching situation,

students need additional time and structured prompting to come up with additional hypotheses and arguments when reflecting on reaction mechanisms. Some students with low prior knowledge might need “thinking bites” in the form of support cards that, analogous to the interviewer prompts, prompt them to incorporate aspects they have not yet thought about. Considering students’ reasoning processes with the twofold perspective on epistemic stances and argumentation processes presented here allows a more holistic description of students’ experience. Further research is required to elicit the interplay between epistemic stances and conceptual understanding and what we can learn from it for future teaching and learning. This case study shows that an in-depth analysis with a process-oriented lens on students’ approaches can shed light on aspects that we might overlook when evaluating only students’ performance and the outcome of their problem-solving.

Acknowledgements

We thank the students who participated in the study and the members of the Graulich research group for fruitful discussions. Leonie Lieber would like to thank the Verband der Chemischen Industrie (German Chemical Industry Association) for supporting her with the Kekulé Fellowship.

References

1. D. Hammer, A. Elby, R. E. Scherr and E. F. Redish, in Transfer of Learning from a Modern Multidisciplinary Perspective, 2005, pp. 89–119.
2. I. Caspari and N. Graulich, Scaffolding the structure of organic chemistry students’ multivariate comparative mechanistic reasoning, Int. J. Phys. Chem. Educ., 2019, 11, 31–43.
3. A. Chakravartty, A puzzle about voluntarism about rational epistemic stances, Synthese, 2011, 178, 37–48.
4. A.-L. Fayard, E. Gkeredakis and N. Levina, Framing innovation opportunities while staying committed to an organizational epistemic stance, Inf. Syst. Res., 2016, 27, 302–323.
5. J.-M. G. Rodriguez, K. Bain, N. P. Hux and M. H. Towns, Productive features of problem solving in chemical kinetics: More than just algorithmic manipulation of variables, Chem. Educ. Res. Pract., 2019, 20, 175–186.
6. R. M. Kelly, S. Akaygun, S. J. Hansen, A. Villalta-Cerdas and J. Adam, Examining Learning of Atomic Level Ideas About Precipitation Reactions with a Resources Framework, Chem. Educ. Res. Pract., 2021, 22, 886–904.
7. D. Hammer, Discovery learning and discovery teaching, Cogn. Instr., 1997, 15, 485–529.
8. D. Hammer and A. Elby, Tapping epistemological resources for learning physics, J. Learn. Sci., 2003, 12, 53–90.
9. J. F. Osborne and A. Patterson, Scientific argument and explanation: A necessary distinction?, Sci. Educ., 2011, 95, 627–638.

10. M. M. Cooper, H. Kouyoumdjian and S. M. Underwood, Investigating Students’ Reasoning about Acid–Base Reactions, J. Chem. Educ., 2016, 93, 1703–1712.
11. K. L. McNeill, D. J. Lizotte, J. Krajcik and R. W. Marx, Supporting students’ construction of scientific explanations by fading scaffolds in instructional materials, J. Learn. Sci., 2006, 15, 153–191.
12. R. Ferguson and G. M. Bodner, Making sense of the arrow-pushing formalism among chemistry majors enrolled in organic chemistry, Chem. Educ. Res. Pract., 2008, 9, 102–113.
13. A. Kraft, A. M. Strickland and G. Bhattacharyya, Reasonable reasoning: multi-variate problem-solving in organic chemistry, Chem. Educ. Res. Pract., 2010, 11, 281–292.
14. M. H. Towns, R. S. Cole, A. C. Moon and C. Stanford, in Argumentation in Chemistry Education, Royal Society of Chemistry, 2019, pp. 247–274.
15. L. Lieber and N. Graulich, Investigating Students’ Argumentation when Judging the Plausibility of Alternative Reaction Pathways in Organic Chemistry, Chem. Educ. Res. Pract., 2022, 23, 38–54.
16. S. E. Toulmin, The Uses of Argument, Cambridge University Press, 2003.
17. M. Ratcliffe, Stance, feeling and phenomenology, Synthese, 2011, 178, 121–130.
18. B. C. Van Fraassen, The Empirical Stance, Yale University Press, 2002.
19. L. Louca, A. Elby, D. Hammer and T. Kagey, Epistemological resources: Applying a new epistemological framework to science instruction, Educ. Psychol., 2004, 39, 57–68.
20. B. K. Hofer and P. R. Pintrich, The Development of Epistemological Theories: Beliefs About Knowledge and Knowing and Their Relation to Learning, Rev. Educ. Res., 1997, 67, 88–140.
21. B. K. Hofer, Personal epistemology research: Implications for learning and teaching, Educ. Psychol., 2001, 13, 353–383.
22. D. Hammer and A. Elby, in Personal Epistemology: The Psychology of Beliefs about Knowledge and Knowing, 2002, pp. 169–190.
23. L. Lieber and N. Graulich, Thinking in Alternatives—A Task Design for Challenging Students’ Problem-Solving Approaches in Organic Chemistry, J. Chem. Educ., 2020, 97, 3731–3738.
24. V. Talanquer, Chemistry Education: Ten Heuristics To Tame, J. Chem. Educ., 2014, 91, 1091–1097.
25. L. S. Vygotsky, Mind in Society: The Development of Higher Psychological Processes, Harvard University Press, 1980.
26. N. P. Grove and S. L. Bretz, A continuum of learning: from rote memorization to meaningful learning in organic chemistry, Chem. Educ. Res. Pract., 2012, 13, 201–208.
27. K. J. Linenberger and S. L. Bretz, Generating cognitive dissonance in student interviews through multiple representations, Chem. Educ. Res. Pract., 2012, 13, 172–178.
28. N. Graulich, A. Langner, K. Vo and E. Yuriev, in Problems and Problem Solving in Chemistry Education: Analysing Data, Looking for Patterns and Making Deductions, ed. G. Tsaparlis, Royal Society of Chemistry, London, 2021, pp. 38–67.

Chapter 8

How Do Students Reason When They Have to Describe the “What” and “Why” of a Given Reaction Mechanism?†

Jolanda Hermanns*a and David Kellera

a Zentrum für Lehrerbildung und Bildungsforschung und Institut für Chemie der Universität Potsdam, Karl-Liebknechtstraße 24–25, 14476 Potsdam, Germany
*E-mail: [email protected]

8.1  Introduction

Designing reaction mechanisms is one of the main topics in the field of organic chemistry. To design reaction mechanisms, students need several basic competences: the confident use of the formulaic language (including the use of electron-pushing arrows), knowledge of the bonding of atoms and consequently their partial charges, the application of concepts, especially the concept of nucleophiles and electrophiles, and the basic principles of designing reaction mechanisms, including writing down each individual step of the mechanism.1 As a possible learning opportunity for students

† Electronic supplementary information (ESI) available: See Tables 8.1–8.3. See DOI: 10.1039/9781839167782

Advances in Chemistry Education Series No. 10
Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices
Edited by Nicole Graulich and Ginger Shultz
© The Royal Society of Chemistry 2023
Published by the Royal Society of Chemistry, www.rsc.org

aiming to design reaction mechanisms independently, mechanism comics were designed and used in a course on organic chemistry. The students’ creations were also used to diagnose the quality of their mechanistic reasoning.

8.2  Theoretical Background—Mechanistic Reasoning and Writing-to-learn in Organic Chemistry

“Does mechanistic thinking improve student success in organic chemistry?” This question by Grove, Cooper and Rush2 would certainly be answered in the affirmative by most organic chemists. Yet most students do not engage in mechanistic thinking, either because they do not find it necessary or because they are not capable of doing so; instead, they mostly try to simply memorize as much content as possible on organic chemistry.3 This reliance on rote memorization, however, prevents meaningful learning. To foster more meaningful learning in organic chemistry, Crandell et al.4 suggested emphasizing logical reasoning about reactions in general chemistry courses. Causal mechanistic explanations should then include explanations for the reaction and not only a description. In recent years, research has examined students’ mechanistic reasoning. Sevian and Talanquer5 defined several modes of reasoning describing different levels of complexity in student reasoning: descriptive, relational, linear causal and multi-component. Asking students to explain mechanisms in their own words is one way to begin to elicit student thinking in terms of symbolism.6 In an interview study, Galloway, Stoyanovich and Flynn7 evaluated students’ interpretations of mechanistic language in organic chemistry before the students had learned the respective reactions. They showed that there is a need to teach students how to approach problems and how to use metacognitive monitoring techniques.
Further, several researchers have used interviews to evaluate students’ mechanistic reasoning.8,9 To support students’ reasoning, several scaffolds for use in interview studies have been designed and evaluated.10,11

Writing-to-learn (WTL) strategies can enhance knowledge acquisition and cognitive skill development in science disciplines.12 Writing is used “to improve student understanding of content, concepts, and the scientific method”.13 It is therefore no surprise that WTL strategies for teaching chemistry have been developed, used and evaluated.14–17 Gupte et al.17 showed that WTL assignments encourage students to build connections between new concepts and existing knowledge and provide students with opportunities for meaningful learning.

8.3  Research Questions

In our study, the students had to write captions for each depicted step of a given reaction mechanism. This can be seen as the application of a WTL strategy, because in writing the captions they had to apply their knowledge of chemical formulas, bonding, and concepts such as the acid–base concept or the concept of electrophilicity and nucleophilicity. To assess the quality of students’ reasoning, the following research questions should be answered; if the quality were insufficient, the task would not be suitable as a learning opportunity for students aiming to design reaction mechanisms.

RQ1: What is the quality of students’ reasoning regarding their description of the “what” of the given reaction mechanism?
RQ2: What is the quality of students’ reasoning regarding their description of the “why” of the given reaction mechanism?

8.4  Methods

8.4.1  The Course “Training OC”

The course “Training OC” was conducted in the winter term 2019/20 as a departmental course (2 h per week) for all students who had to attend courses on organic chemistry during their bachelor studies.1 Around 30 students attended each week; almost all were preservice chemistry teachers in their second undergraduate year. The goal of this additional course was to train basic competences in organic chemistry and their application, and its main aim was to enable students to solve reaction mechanisms independently, especially reactions unknown to them. As one element in training students to understand reaction mechanisms, comics depicting complete reaction mechanisms were created and used in the lessons. Because these comics are the focus of our study, they are described in more detail below.

8.4.2  Sample

For the third topic of the course (“reaction mechanisms”), two reaction mechanisms were broken down into their individual steps: three steps for each mechanism (for an example, see Figure 8.1). The students were asked to describe and explain each step in the reaction by writing captions. The wording of the task was as follows: “Two reaction mechanisms are given. Write beneath the boxes what happens and why.” After completing the tasks, the students’ products (N = 26) were collected and evaluated. The goals of this exercise were explained to the students, namely providing them with feedback and evaluating their skills during the course, as well as the importance of the results for our research. Anonymity was guaranteed, and all students gave their consent for their products to be used. Excerpts appearing in this publication were translated from German to English.

8.4.3  The Coding Process

The coding of the students’ products was conducted inductively and deductively (deductively only for round 2) in three rounds by both authors (see Table 8.1 online).

Figure 8.1  The mechanism comics with a representative example of one student's captions.

In the first round, the written captions were classified as ‘descriptive’, ‘descriptive-causal’, or ‘causal’. The coding scheme for the second round of coding included two categories, ‘properties of entities’ and ‘activities of entities’, as described by Watts et al.18 as an adaptation of the coding scheme proposed by Russ et al.19 The category ‘properties of entities’ consisted of the codes “acid–base”, “nucleophile–electrophile” and “charge”. The category ‘activities of entities’ consisted of “explicit electron movement” (including “attack of the free electron pair”), “implicit electron movement” (including “attack–split off”, “protonate–deprotonate” and “mesomerism”) and “changes in bonding”. Due to the small portions of text beneath the steps of the reaction mechanisms, these two categories were sufficient to answer our research questions. As discussed by Watts et al.,18 the entities were not coded separately, because the students always described the properties of the entities’ activities. After the coding process was finished, we investigated how many students used each code at least once. The classification16 was adapted to ensure a good fit with our data. Before analyzing the data in detail, the percentage of responses in each category was determined to establish whether the students produced responses matching the respective codes. In the third round, the students’ descriptions and explanations were analyzed to decide whether they were ‘technically correct’, ‘partially correct’, or ‘incorrect’. Students’ products were coded as partially correct if only one part was incorrect, such as the use of “hydrogen” instead of “proton”.
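The second-round coding scheme and the “used at least once” tally described above can be sketched in a few lines. This is purely illustrative: the student data, dictionary layout, and function name below are hypothetical, not part of the study’s materials.

```python
# Illustrative sketch of the second-round coding scheme (two categories
# with their codes) and of tallying how many students used each code at
# least once. The coded student products are hypothetical examples.

CODING_SCHEME = {
    "properties of entities": [
        "acid-base", "nucleophile-electrophile", "charge",
    ],
    "activities of entities": [
        "explicit electron movement",  # incl. "attack of the free electron pair"
        "implicit electron movement",  # incl. "attack-split off", "protonate-deprotonate", "mesomerism"
        "changes in bonding",
    ],
}

# One set of assigned codes per student product (hypothetical data).
coded_products = {
    "student_01": {"charge", "nucleophile-electrophile", "changes in bonding"},
    "student_02": {"charge", "acid-base", "implicit electron movement"},
}

def code_usage(products):
    """Percentage of students who used each code at least once."""
    n = len(products)
    all_codes = [code for codes in CODING_SCHEME.values() for code in codes]
    return {code: 100 * sum(code in used for used in products.values()) / n
            for code in all_codes}

usage = code_usage(coded_products)  # e.g. usage["charge"] == 100.0 here
```

A flat mapping from student to code set keeps the “at least once” question a simple membership test, mirroring how the percentages in Section 8.5 are reported.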

The explanations of the “why” of the reaction were inductively coded and the respective codes assigned by both coders. The results from all three rounds were then used to analyze the “what” (descriptive) and “why” (causal or descriptive-causal) parts of the students’ captions in more detail to assess the quality of their reasoning, because the ability to “comprehend and explain in clear language fundamental science concepts is essential for students’ knowledge of chemistry” (for examples, see Tables 8.2 and 8.3 online). Therefore, “it is important for students to learn the language of chemistry with its technical terms, formulae, and patterns of argumentation”.20 For all three rounds, both coders discussed and compared their classifications until 100% inter-rater agreement was reached.21
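The compare-and-discuss procedure for reaching full inter-rater agreement can be sketched as follows; the function and the coder dictionaries are hypothetical illustrations, not the study’s instruments.

```python
# Illustrative only: flag the items on which two coders disagree so they
# can be discussed and re-coded until agreement reaches 100%.

def compare_coders(coder_a, coder_b):
    """Return percentage agreement and the items the coders must discuss."""
    disagreements = [item for item in coder_a if coder_a[item] != coder_b[item]]
    agreement = 100 * (len(coder_a) - len(disagreements)) / len(coder_a)
    return agreement, disagreements

# Hypothetical round-one classifications for three mechanism steps.
coder_a = {"step 1": "descriptive", "step 2": "descriptive-causal", "step 3": "causal"}
coder_b = {"step 1": "descriptive", "step 2": "causal", "step 3": "causal"}

agreement, to_discuss = compare_coders(coder_a, coder_b)
# to_discuss lists the steps both coders revisit in discussion
```

After each discussion round, the revised classifications would be compared again until the disagreement list is empty.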

8.5  Results and Discussion

8.5.1  RQ1: What is the Quality of Students’ Reasoning Regarding Their Description of the “What” of the Given Reaction Mechanism?

Before analyzing the “what” descriptions in detail, the students’ reasoning was first assigned to the categories ‘descriptive’ (the student’s product includes only a description without an explanation), ‘descriptive-causal’ (the student’s product includes both descriptive and causal explanations) or ‘causal’ (the student’s product consists only of causal explanations). After coding the students’ explanations for both mechanisms, eight explanations emerged as completely descriptive, 18 (mechanism 1) and 17 (mechanism 2) were descriptive-causal, and only one (mechanism 2) was classified as completely causal. For 17 of the students (65%), the classifications of their reasoning were the same for both mechanisms. Almost all students used features that were assigned to the codes “acid–base” (96%), “nucleophile–electrophile” (100%) and “charge” (96%). All students therefore identified properties belonging to the given reaction mechanisms, as both mechanisms included acid–base as well as nucleophile–electrophile interactions. The high percentage for the code “charge” indicates that the students used the charges of entities as an explanatory tool for their mechanistic reasoning. Basic principles such as the attraction of oppositely charged ions seem to be known and applicable. As shown in Figure 8.2, all students used features that were assigned to the code “attack–split off” (“implicit electron movement”) and the category “changes in bonding”. This means that the interaction of two particles and the changes in bonding are both linked to the reaction mechanisms, which indicates that the students have basic knowledge of what occurs during a chemical reaction.
However, in their descriptions the students seemed to favor the more general wording “attack” over specialist terms like “protonate–deprotonate”, which was used by only 38% of the students. Electron movement occurring via mesomerism was coded for only 19% of the students.

Figure 8.2  Students' features for the category “activities of entities”. Color codes: explicit electron movement (dark blue), implicit electron movement (light blue), changes in bonding (grey).

Most students (85%) cited explicit electron movement in their descriptions and explanations. 76% of the students based their arguments at least once on the attack of the free electron pair for a particular step in the reaction mechanisms. Considering that the seminar sessions preceding this study focused especially on determining partially charged atoms, adding free electron pairs, and writing down the first step of a reaction mechanism using electron-pushing arrows, this percentage is not satisfactory. It seems that not all students had internalized those basic concepts in a way that allowed them to apply them in their writing, or else they judged the application of the concepts as irrelevant for the task at hand. For both codes, the students’ use of the features in their mechanistic reasoning is discussed in detail below.

8.5.1.1  Properties of Entities

All students used features that were coded as belonging to the concept of nucleophile–electrophile, because this concept was emphasized in the sessions they attended before creating the mechanism comic. The students named particles involved in the reaction mechanism as either “nucleophile” or “electrophile”. Sometimes a definition was also given, as in the example “the negatively charged nucleophile”. Although not all nucleophiles are negatively charged, this shows that one property of nucleophiles was known to the students who defined the nucleophile in this way; these students based their argumentation on only a surface feature of nucleophiles.22 The explanation “the base has a free electron pair and therefore reacts as a nucleophile” was used more often in the students’ writing, for example in the first step of mechanism 1, which shows that the students who used this description were arguing on the electron level of a reaction mechanism and used electron movement in their line of reasoning.

Almost all students emphasized charges in their arguments (96%). Words such as “negative”, “positive”, “neutral” or “partially positive or negative” were used and sometimes also explained, for example in step 1 of the second mechanism: “3-bonded oxygen and therefore positive”. Although this explanation is not completely correct, it shows that the student has some idea that the bonding of an atom is responsible for it being either charged or neutral. This is also supported by the following description: “now C has three bonds and is positive”. An explanation in which one can assume that the octet rule was applied in some way can also be found: “because of this the O has a positive charge, because it has 3 bonds and only 1 free electron pair”. One student did apply the octet rule to determine whether there was a charge, but used the wording “atoms” instead of “electrons”: “6–5 atoms an electron is missing; positive charge”. Another student also argued via the octet rule and wrote: “because of this O has only 5 electrons positive charge”. A less favourable explanation for the same mechanistic step was: “the O-atom of the carbonyl group is protonated; therefore a positive charge at the O-atom”. A very similar description was given for the third step of the second mechanism: “The proton is split off. Because of this the positive charge of the Nu disappears”. Here, the positive charge is treated like some sort of particle that can be transferred from one atom to another. Although the idea that a positively charged atom added to a neutral molecule will form a particle that is also positively charged is understandable, and in principle not completely wrong, it does not explain why the O atom is positively charged.
The focus should therefore always be on the bonding of each atom. The students also identified acids and bases, which is not surprising because those words are part of the formula scheme. The acidic or basic environment was not described; the students only used the words “acid” and “base”. The properties of the two compound groups were assigned correctly, as the following citations show: “from the acid H+ is added to the O”, or “acid is a proton donator”. Here, the Brønsted theory, namely that an acid is a compound from which a proton is split off, was used correctly to describe the property of the acid. Another student, also using the Brønsted theory, wrote: “compound is deprotonated by the base. The base picks up the H+”. Although these are only two citations, the analysis shows that some of the students argued based on the Brønsted theory of acids and bases, as has been discussed by Cartrette and Mayo.23 This is not surprising, because this is a theory they know from secondary school as well as from courses on general chemistry. The students made use of the appropriate entities and their properties, and they gave correct or nearly correct explanations for those properties. Whether this led to a correct argumentation regarding the entities’ activities will be analyzed in the following section.

8.5.1.2  Activities of Entities

All students described electron movement: all of them described implicit electron movement and 85% described explicit electron movement. All of them also described changes in bonding. Students’ descriptions of implicit electron movement and the assignment of the codes will be analyzed first. Descriptions coded with “attack–split off” were found in all students’ writings. The word “attack” was used mostly to describe the reaction of the nucleophile with the carbonyl compound (in both mechanisms, the second step). This step, in which a nucleophile reacts with an electrophile, was also often called a “nucleophilic attack”, a description commonly used in lectures and chemistry books. Another reason for the terms “nucleophile attacks” and “nucleophilic attack” is most certainly the way this step is illustrated in the mechanism diagram: an electron-pushing arrow leads from the nucleophile to the carbonyl C atom. This direction indicates some sort of movement towards the C atom, and this movement is described as an “attack” because this is the vocabulary also used by many experts. None of the students described this step as the movement of each particle towards the other, which would be the correct way to describe the interaction between two particles. However, from the students’ writing alone it is not possible to deduce whether they are convinced that the movement is only from the nucleophile to the C atom, or whether they know that both particles are attracted to each other but “only” use language that is commonly used and that they therefore assess as suitable to describe this step in the reaction mechanism. In summary, the activity of the “nucleophile” entity was described in the correct (or usual) way. It remains worthy of discussion whether the commonly used vocabulary, such as the word “attack”, is suitable for students starting to learn organic chemistry.
Most certainly the purpose of the electron-pushing arrow must be discussed more explicitly and transparently to ensure that no misconceptions about the basic principles of chemical reactions are supported.

The code “protonate–deprotonate” was assigned less often, for only 38% of the students. The activity of splitting off or adding protons was mostly described in words other than the technical terms “protonate–deprotonate”, for example “base takes H+ from acid”. In total, 17 students did not use the terms at all, four students sometimes used them, although not for all possible steps, and five students argued only with the technical terms “protonate–deprotonate” and therefore argued based on the Brønsted concept. Although the activities (adding and splitting off protons) were in general described correctly, most of the students used different language to describe them. Some of the descriptions also rely on the Brønsted concept but do not include the terms “protonate–deprotonate”, only a description such as “the base takes the proton”. For all other students, it remains to be investigated which concepts, if any, they used. The application of the Lewis acid–base concept would be preferable, because students employing the Brønsted concept only focus on surface-level features to identify acids and bases and to describe their activities.24 If the students understand proton transfer as

How Do Students Reason When They Have to Describe the “What” and “Why”


an interaction between an electron pair and a proton, acid–base reactions could be a foundation for other reactions.24 The captions of the 17 students who used other terms to describe the acid–base reactions were therefore analyzed with a focus on their application of the Lewis acid–base concept. All of these students argued at least once in reference to the attack of the free electron pair of the base towards the proton and therefore employed the Lewis acid–base concept. The results of this application will be discussed in more detail in the section on explicit electron movement.

Only five students (19%) wrote descriptions receiving a “mesomerism” coding. The use of this term showed that the students had some misconceptions around it, as the following citations show: “because of the mesomeric formula it is possible to protonate this group”, “mesomerism causes the double bond to shift to O” or even “because of the mesomer where the double bond is broken, a stable carbenium ion is formed”. The students did not grasp the concept of mesomerism in the sense of several resonance structures for one molecule, none of which represents the actual structure of the molecule. They described the “mesomer” as a separate compound.

Because all students’ written captions included parts that were coded with “changes in bonding”, these were clustered for the discussion into two sub-codes. The sub-code “bonds are formed” was assigned 21 times and the sub-code “bonds are broken” 55 times. A typical example for the first sub-code is “the free electron pair from O builds a new bond to H”. The bond breaking focuses mostly on the step in which the double bond between C and O breaks: “The double bond becomes a single bond” or “the double bond to O is dissolved”. In their reasoning, the students seem to focus more on bond breaking than on bond formation.
One possible explanation could be that, for the breaking of the bond between C and O, they concentrate on applying the rule that the C atom can only have four bonds. The change in bonding is seen right away; the double bond is replaced by a single bond.

Explicit electron movement was coded generally as “explicit electron movement” and more specifically as “free electron pair attacks.” Only two students’ descriptions of explicit electron movement did not describe the attack of a free electron pair. Both described the second step in the first mechanism comic: the bound electron pair shifts towards the O atom and becomes a free electron pair at this atom. All other students described the attack of a free electron pair for two different steps in the reaction mechanism. The first step featured the attack of the base with its free electron pair: “the free electron pair of the base attacks the hydrogen atom” and “the negatively charged oxygen atom attacks the hydrogen atom of the acid”. The students argued here using the Lewis concept of acids and bases. Therefore, they did not only describe “what” was happening (“the base attacks the hydrogen atom”), but also “how” (“the free electron pair of the base attacks”). This approach was practised at the beginning of the course, because electron-pushing arrows were introduced while revisiting acid–base reactions, which the students should know from their courses on general and inorganic chemistry. The suggestion to let students start training their mechanistic reasoning in


general chemistry, as proposed by Crandell et al.,4 was therefore applied at the beginning of our course. The second step featured the attack by the free electron pair of the nucleophile on the C=O double bond: “nucleophile with free electron pair attacks the δ+ carbon”. Here, the movement of the electrons was also described explicitly. The concept of nucleophiles and electrophiles was already known to the students from earlier in the course, and they had had some training in writing down reaction steps by applying electron-pushing arrows for this sort of reaction. 76% of the students seem to have already internalized this concept, as their reasoning shows.

In the third round of the coding process, all descriptions were analyzed for technical correctness. In total, 108 descriptions were analyzed: 8 (7%) were technically incorrect, such as “acid takes H+ from base”, where the terms ‘acid’ and ‘base’ were mixed up even though the reaction step (third step in comic 1) was drawn correctly. 20 (19%) were only partially technically correct. Generally, the students did not distinguish between atoms and ions and often wrote an H atom instead of H+ or a proton, as in: “acid minus H is base”. Instead of writing down ‘free electron pair’, the students seemed to name only one electron: “free e− attacks H+”. It may also be that the students used “e−” as an abbreviation for “electron pair.” Sometimes, the students mixed up the technical terms ‘atoms’ and ‘electrons,’ as in: “O now has 6 atoms and is therefore without charge”. However, the majority of the “what” descriptions (80; 74%) were technically correct.

To summarize, the quality of students’ reasoning regarding the question of “what” was happening in the reaction steps was generally appropriate and technically correct. The students particularly utilized their knowledge of the concept of nucleophiles and electrophiles that they had learned in prior sessions of the course.
Students who argued via the Lewis concept of acids and bases, and therefore described electron movement explicitly, not only described “what” happened in the reaction mechanism, but also “how”. Whether the students also explained “why” the mechanistic steps occurred as they did remains open, however, and will be discussed in detail by answering RQ2.
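As a minimal arithmetic illustration (not part of the authors' methodology), the correctness percentages reported above for the 108 "what" descriptions follow directly from the raw counts; the rating labels below are hypothetical stand-ins for the coding scheme:

```python
# Reproduce the reported percentages (7% / 19% / 74%) from the raw counts
# of the 108 coded "what" descriptions. Purely illustrative.
from collections import Counter

ratings = (["incorrect"] * 8
           + ["partially correct"] * 20
           + ["correct"] * 80)
counts = Counter(ratings)

for label, n in counts.items():
    print(f"{label}: {n} ({round(100 * n / len(ratings))}%)")
```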

8.5.2  RQ2: What is the Quality of Students’ Reasoning Regarding Their Description of the “Why” of the Given Reaction Mechanism?

To answer RQ2, the written captions from the 18 students (69%) assigned to the category ‘descriptive-causal’ or ‘causal’ were analyzed in detail. For a detailed evaluation of the students’ explanations, they were inductively clustered using the method of qualitative content analysis25 into the sub-categories “charges,” “bonding” and “concepts.” The sub-category “concepts” was then divided into “Brønsted acid–base concept” and “concept of nucleophiles and electrophiles.” Figure 8.3 shows the percentage of students’ explanations that were coded for the four sub-categories. All students’ explanations were


also classified as being ‘technically correct’, ‘partially correct’, or ‘incorrect’. Only 8 (12%) explanations were incorrect, 25 (37%) were partially correct, and the majority (35; 51%) were technically correct.

Figure 8.3  Percentage of students’ explanations in the four sub-categories.

8.5.2.1  Charges

The code “charges” was assigned most often (41% of the codes). The reasons “why” the charges changed for the atoms that were part of the reaction mechanism’s steps were mostly discussed either correctly or at least partially correctly; only one explanation was technically incorrect. If the students argued correctly, they used the total sum of the valence electrons at the respective atom to explain the charge, as in: “One electron pair is shifted towards the oxygen. Therefore, O has seven electrons and is negatively charged”. If the answer was coded as “partially correct”, the students argued by means of the shift of electrons or protons and not the sum of valence electrons: “the proton is split off from the nucleophile; because of this it receives a negative charge” or “the bound electrons shift to the nucleophiles, which therefore receive a negative charge”. In this last example, it remains unclear whether the student had at least kept in mind the total sum of the valence electrons; regardless, it was not written down explicitly. It can therefore also be assumed that the student thought that an atom always receives a negative charge if electrons shift towards it. Whether the student had checked the total sum of valence electrons remains unclear without a written explanation.

In their explanations, the students focus on the charges of the particles involved in the reaction steps and therefore utilize surface features.26 Several reasons for this observation are possible. Students may have struggled to determine the charges, although it is equally possible that they may not have prioritized identifying the charges during the construction or as understood


here, in the explanation of “what” occurred in the various steps of the reaction. However, the significance of correct charges in a reaction mechanism does seem to be clear to the students.

8.5.2.2  Bonding

The changes in bonding that occur in the given reaction mechanisms were explained less often (35%). In the explanations that were incorrect (30% of the explanations), several misconceptions can be observed. The students argued that the bond was formed either to stabilize the particle (“the negatively charged O− grabs a H+ from the acid to stabilize itself”) or to neutralize the charged atom, therefore fulfilling the octet rule (“the O atom needs to give one electron to fulfill the octet rule and therefore takes a H+ from an acid”). This shows that those students did not grasp either the concept of stabilization or the octet rule. In addition, the concept of “mesomerism” and the reasons for breaking bonds were not clear to these students, as the following citations show: “the mesomerism causes the shift of the double bond towards the O” and “the base attacks with its free electron pair the H+ and because of this the bound electrons shift towards the Nu”.

However, 70% of the explanations were either fully or partially correct. Those explanations mostly referred to the maximum number of bonds an atom can have: “H cannot have two bonds therefore the Nu− is split off” or “the double bond becomes a single bond, because C can only have 4 bonds”. Only in one-third of the explanations of “why” a specific step occurred did the students argue “why” bonds were formed or broken, although changes in bonding are the central aspect of chemical reactions in general and especially of reaction mechanisms. However, this importance could also be the reason why the students explained changes in bonding less often: they had perhaps internalized this concept so thoroughly that explanations seemed unnecessary.

8.5.2.3  Brønsted

The descriptions explaining “why” steps occurred in reaction mechanisms included references to both the Brønsted and Lewis concepts of acids and bases. Of the two, only the explanations that explicitly included the Brønsted concept were coded here. Parts where the Lewis concept could be identified as the theoretical basis were coded with “nucleophile–electrophile”, because the argumentation fit better with that overarching concept, as in “attack of the base to H+, because H is δ+ and the base δ−”. The idea that particles interact because one is partially negatively and the other partially positively charged is one of the basic ideas of the concept of nucleophiles and electrophiles.

Only 8 explanations (10%) were coded with Brønsted. This is a very small number compared to the descriptions of “what” happened in the reaction steps. It can be assumed that, for the students, the statement that a particle protonates or deprotonates is sufficient also for the “why”,


because they have sufficient knowledge of the concept and write their captions from an expert point of view. In this case, there would obviously be no need for them to discuss this at length. All descriptions of the “why” for this sub-category were coded as partially correct, because they were more definitions than explanations, as with “strong base grabs H+ and thus the base becomes acid” or “acid is proton donator. Thus, the acid becomes a base, because it can now accept H+ again”. Although the words “thus”, “because”, etc., are included in these argumentations, which were therefore coded as “causal”, they are not really explanations. This would support the assumption that the students did not see the need to explain the steps that included acid–base reactions following the Brønsted concept.

8.5.2.4  Nucleophile–Electrophile

The code for the sub-category “nucleophile–electrophile” was assigned for 14% of the explanations, all of which were partially correct. All of these explanations addressed interactions between differently charged particles or between free electron pairs and partially positively charged atoms, as with “nucleophilic attack on the carbonyl C, because this is partially positive”. What is lacking in this explanation is the statement that differently charged particles attract each other. Nevertheless, the basic idea of this concept appears to be quite familiar to these students. Although the words “thus”, “because”, etc., were used, these explanations are only understandable for experts. This is comparable to the descriptions of the electron movement in the “what” descriptions. The students explicitly identified electron movement and therefore described “how” the reaction step occurred. It is possible that the students, from an expert point of view, assessed their description of the “how” as already sufficient for the “why”.

To summarize, the quality of students’ reasoning regarding the question of “why” the reaction steps occurred as they did remained mostly on a descriptive-causal level. Even the parts coded as “causal” describe more “how” a step occurred (“the free electron pair of the base attacks”) than “why” (“the nucleophile attacks…”), although those explanations may be assessed as sufficient by experts. However, the students were able to apply the concept of nucleophiles and electrophiles in their reasoning, which is a positive learning outcome of the course, because there was a clear focus on this concept and its application to new problems. Nonetheless, to act as a learning opportunity and a pre-requisite for designing reaction mechanisms, the task should be revised (see Section 8.7).

8.6  Limitations

The students’ participation in the course “Training OC” was voluntary. However, 26 students participated, which is an acceptable group size for a qualitative study. For the assessment of the students’ products, it should be


considered that the students were still in the middle of their learning process. However, observations during the course showed that all students were motivated to learn the content of this course; participation during the activities (e.g., learning games) was very good. The proficiency level varied: both very good and good-to-average students participated. However, all students can be seen as beginners. The task of writing comic captions was also unfamiliar; all students wrote such descriptions of reaction mechanisms for the first time, which could have influenced the results if the writing itself, and not (or not only) the content, was the problem.

8.7  Implications

The results of this study have implications for teaching and for further research in this field. For instructors, it can be recommended to divide the task of writing captions for a given reaction mechanism into three parts: writing down “what” happens in the mechanistic steps, “how” this happens, and “why”. This is because the results of our study showed that the students mostly wrote “what” and “how” the various steps occurred, but only rarely “why”. In the future, before the students begin the task, an example including all three subtasks should be given for another reaction mechanism and discussed with the students. To further support the students and encourage them to explain the reaction steps, lists of words and verbs that they can (or should) use for their captions, such as technical terms (e.g., “nucleophile” or “attack”) and words for causality (e.g., “because” or “due to”), can be added to the task as aids.

In a further development of this task, the students can be asked to write both the steps and the captions for a previously unknown mechanism. As a prerequisite, students should be able to apply the electron-pushing formalism.27,28 Subsequent analyses could then compare students’ proposed mechanisms with captions to proposed mechanisms without captions, to determine whether writing captions supports students in constructing reaction mechanisms. The evaluation of the captions can also be used to provide individual feedback, such as peer feedback between students (for an example, see Finkenstaedt-Quinn et al.15). For researchers, besides the evaluation of the steps in the reaction mechanisms and the captions, interviews or think-aloud studies would provide additional information on the process of writing down the steps and/or captions.
To use the comics as a diagnostic tool, an automatic, computer-based evaluation of the written captions (for examples, see Wulff et al.29 or Dood et al.30) would be supportive as well.

In summary, for teaching organic chemistry, the use of mechanism comics can be seen as offering added value: not only do the students get an opportunity to actively think through the reaction steps and their underlying reasons, but the teacher also gets a relatively uncomplicated diagnostic tool


that provides interesting insights into students’ thinking, reasoning, and misconceptions.
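To make the idea of an automated caption screening concrete, the simplest conceivable approach is keyword matching against a list of causal markers, similar to the word lists suggested as task aids above. The Python sketch below is purely illustrative and hypothetical; the cited tools (Wulff et al., Dood et al.) use machine-learning methods, not this:

```python
# Hypothetical keyword-based screening: flag a written caption as "causal"
# if it contains a causal marker, otherwise treat it as "descriptive".
# A real evaluation tool would use trained NLP models instead.

CAUSAL_MARKERS = {"because", "thus", "therefore", "due to", "so that"}

def classify_caption(caption: str) -> str:
    text = caption.lower()
    return "causal" if any(marker in text for marker in CAUSAL_MARKERS) else "descriptive"

captions = [
    "The double bond becomes a single bond, because C can only have 4 bonds.",
    "The base picks up the H+.",
]
for caption in captions:
    print(classify_caption(caption), "-", caption)
```

Such a screen could only pre-sort captions for a teacher's review; as Section 8.5.2.3 shows, the mere presence of "because" or "thus" does not guarantee a genuine explanation.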

Acknowledgements

This project is part of the “Qualitätsoffensive Lehrerbildung,” a joint initiative of the Federal Government and the Länder which aims to improve the quality of teacher training. The programme is funded by the Federal Ministry of Education and Research. The authors are responsible for the content of this publication. We thank all students who participated in this study. We thank Prof. Dr Nicole Graulich and Irina Braun for inspiring discussions and useful tips. We thank Hilke Schulz for technical support.

References

1. J. Hermanns, J. Chem. Educ., 2021, 98, 374.
2. N. P. Grove, M. M. Cooper and K. M. Rush, J. Chem. Educ., 2012, 89, 844.
3. N. P. Grove and S. L. Bretz, Chem. Educ. Res. Pract., 2012, 13, 2013.
4. O. M. Crandell, H. Kouyoumdijan, S. M. Underwood and M. M. Cooper, J. Chem. Educ., 2019, 96, 213.
5. H. Sevian and V. Talanquer, Chem. Educ. Res. Pract., 2014, 15, 10.
6. M. M. Cooper, J. Chem. Educ., 2015, 92, 1273.
7. K. R. Galloway, C. Stoyanovich and A. B. Flynn, Chem. Educ. Res. Pract., 2017, 18, 353.
8. I. Caspari, M. L. Weinrich, H. Sevian and N. Graulich, Chem. Educ. Res. Pract., 2018, 19, 42.
9. S. A. Finkenstaedt-Quinn, A. S. Halim, G. Kasner, C. A. Wilhelm, A. Moon, A. R. Gere and G. V. Shultz, Chem. Educ. Res. Pract., 2020, 21, 922.
10. I. Caspari and N. Graulich, Int. J. Phys. Chem. Educ., 2019, 11, 31.
11. N. Graulich and I. Caspari, Chem. Teach. Int., 2021, 3, 19.
12. L. P. Rivard, J. Res. Sci. Teach., 1994, 31, 969.
13. J. A. Reynolds, C. Thaiss, W. Katkin and R. J. Thompson, CBE: Life Sci. Educ., 2012, 11, 17.
14. G. V. Shultz and A. R. Gere, J. Chem. Educ., 2015, 92, 1325.
15. S. A. Finkenstaedt-Quinn, E. P. Snyder-White, M. C. Connor, A. R. Gere and G. V. Shultz, J. Chem. Educ., 2019, 96, 227.
16. S. A. Finkenstaedt-Quinn, F. M. Watts, M. N. Petterson, S. R. Archer, E. P. Snyder-White and G. V. Shultz, J. Chem. Educ., 2020, 97, 1852.
17. T. Gupte, F. M. Watts, J. A. Schmidt-McCormack, I. Zaimi, A. R. Gere and G. V. Shultz, Chem. Educ. Res. Pract., 2021, 22, 396.
18. F. M. Watts, J. Schmidt-McCormack, C. Wilhelm, A. Karlin, A. Sattar, B. Thompson, A. R. Gere and G. V. Shultz, Chem. Educ. Res. Pract., 2020, 21, 1148.
19. R. S. Russ, R. E. Scherr, D. Hammer and J. Mikeska, Sci. Educ., 2008, 92, 499.


20. S. Markic and P. E. Childs, Chem. Educ. Res. Pract., 2016, 17, 434.
21. J. Saldaña, The Coding Manual for Qualitative Researchers, Sage Publications, London, 2013.
22. M. E. Anzovino and S. L. Bretz, Chem. Educ. Res. Pract., 2015, 16, 797.
23. D. P. Cartrette and P. M. Mayo, Chem. Educ. Res. Pract., 2011, 12, 29.
24. M. M. Cooper, H. Kouyoumdijan and S. M. Underwood, J. Chem. Educ., 2016, 93, 1703.
25. U. Kuckartz, Qualitative Inhaltsanalyse. Methoden, Praxis, Computerunterstützung [Qualitative Content Analysis. Methods, Practice, Computer Assistance], Beltz, Weinheim and Basel, 2016.
26. N. Graulich, S. Hedtrich and R. Harzenetter, Chem. Educ. Res. Pract., 2019, 20, 924.
27. G. Bhattacharyya and G. M. Bodner, J. Chem. Educ., 2005, 82, 1402.
28. G. Bhattacharyya, Chem. Educ. Res. Pract., 2014, 15, 594.
29. P. Wulff, D. Buschhüter, A. Nowak, A. Westphal, L. Becker, H. Robalino, M. Sted and A. Borowski, J. Sci. Educ. Technol., 2020, 30, 1.
30. A. J. Dood, J. C. Dood, D. Cruz-Ramirez de Arellano, K. B. Fields and J. R. Raker, Chem. Educ. Res. Pract., 2020, 21, 267.

Chapter 9

In-the-moment Learning of Organic Chemistry During Interactive Lectures Through the Lens of Practical Epistemology Analysis

Katie H. Walsh,a Jessica M. Karcha and Ira Caspari-Gnann*a

a Tufts University, 419 Boston Ave, Medford, MA 02155, USA
*E-mail: [email protected]

Advances in Chemistry Education Series No. 10
Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices
Edited by Nicole Graulich and Ginger Shultz
© The Royal Society of Chemistry 2023, Published by the Royal Society of Chemistry, www.rsc.org

9.1  Introduction

What is the current state of research on student reasoning in organic chemistry and what future directions do we propose? Our chapter approaches these big questions of the field from the perspective that although a growing body of literature explores how students reason in organic chemistry, less is known about what drives that reasoning in the moment, and how this affects learning. To address this challenge, we will briefly review literature that explores some core challenges for successful organic chemistry reasoning, where students can learn to master these challenges, and how learning in these contexts occurs. We will then introduce how practical epistemology


analysis can be used to operationalize and investigate student in-the-moment learning in organic chemistry.

One core challenge of organic chemistry reasoning is how students meaningfully relate knowledge pieces. Students might reason based on rules or cases encountered previously, rather than abstracting from this prior knowledge to identify what is appropriate for the case at hand.1–3 While these ways of utilizing prior knowledge for reasoning about organic chemistry can be successful, it is challenging for students to use them flexibly and appropriately and not rely too heavily on matching superficial features of the problem at hand to previous examples.1–3 In addition to drawing on prior knowledge in ways that go beyond superficial similarities, students must also be able to incorporate and relate knowledge pieces in their reasoning beyond describing what happens in any given reaction to include how and why transformations occur in certain ways and not others.4–10 More specifically, this requires students to use explicit information represented in the problem context to infer underlying implicit information about electronic and energetic properties, use this information to predict activities molecules might undergo, and incorporate multiple alternatives.11–14

Many studies have found that the complexity of student reasoning varies between students and problem contexts.4–8,10,11,13,14 Common trends observed include that reasoning about how and why transformations occur is more challenging for students than describing what happens; inferring implicit information is more challenging than drawing on explicit information; and incorporating multiple alternatives adds to the complexity that organic chemistry reasoning requires.
Given that it is a core challenge for organic chemistry students to use prior knowledge and provided information in productive ways to relate knowledge pieces in complex reasoning networks, institutions and instructors must set up course environments that support students to master this challenge.15,16 To centre students’ reasoning, experiences, and challenges in the course environment, some institutions are adopting active learning pedagogies to engage students in the learning process while they are in the classroom, as opposed to the passive style of lecture that dominates traditional organic chemistry courses.16 Active learning has been shown to improve a variety of affective and cognitive student outcomes in organic chemistry.17–25 For example, increased class time for group learning and discussion may decrease students’ cognitive load, contributing to higher feelings of emotional satisfaction about organic chemistry content.23 Active learning strategies also lead to positive assessment outcomes, e.g., increases in course grades19,23 and decreases in failure/withdrawal rates.23,24 As organic chemistry often acts as a gatekeeper course that determines who is included and who is pushed out of STEM,23 active learning may be a strategy to support both student success in organic chemistry and STEM retention more broadly.26 Although studies examining the relationship between active learning and equity in organic chemistry are limited, studies in general chemistry show these implications are especially pertinent for students historically excluded from STEM, e.g., students of color, students from low socioeconomic backgrounds, and first-generation students.27


In organic chemistry, active learning strategies may include small group collaborative work, allowing students to practice applying new knowledge in a space with clear avenues for assistance, as opposed to focusing on content delivery in the classroom.25 This shifts the focus of class time to student peer-to-peer interactions, prioritizing learning rather than teaching in the classroom. Many active learning environments also include more advanced students as facilitators for small group discussions, increasing the number of interactions students have with someone other than a peer.28 One such model is the learning assistant model, which originated in physics classrooms29 and has since been implemented in organic chemistry courses at multiple institutions.25,30 Learning assistants (LAs) are advanced undergraduate students who receive training in a pedagogy course, meet with the instructional team, and support student learning in active learning environments by facilitating small group discussions.

To understand what student groups do in active learning classrooms that makes their collaboration worthwhile, and to begin to relate students’ interactions to their learning, we turn to literature on what happens during student group discussions.31–34 For instance, Christian and Talanquer31 characterized three types of social interactions in organic chemistry study groups related to different ways of knowledge building: teaching (one student explains to the others), tutoring (students ask questions and one group member answers), and co-construction (equal participation in knowledge building). Kulatunga et al.33 found similar patterns and extended these findings to explore the relationship between students’ participation and the nature of their arguments more deeply.
For example, students engaged specifically in co-construction crafted higher-level arguments that were more often backed with reasoning.33 Pabuccu and Erduran’s34 work supports this as well, finding that more collaborative social interactions lead to higher-level arguments, particularly in high-performing groups. While most of this work focuses on peer–peer collaboration, Kulatunga and Lewis32 examined how interactions with advanced peer leaders affect student argumentation. Specifically, they found that a combination of short questions and probing and clarifying behaviours from peer leaders led to effective student argumentation.32

In the brief review above, we showed that active learning approaches can support student reasoning in organic chemistry, and that interaction patterns connect to some aspects of learning. However, little literature has bridged these two strands to examine how student in-the-moment learning progresses during these social interactions, focusing instead on either the nature of student reasoning or the nature of student interaction. In the next section, we will introduce how practical epistemology analysis can be used as a tool to analyze collaborative student discussions to reveal deep insights into the nature of student in-the-moment learning in organic chemistry contexts.

9.1.1  Practical Epistemology Analysis (PEA)

Practical epistemology analysis (PEA) is grounded in a sociocultural approach to epistemology.35–37 In science education, there are three major strands of epistemological research: the study of the nature of disciplinary knowledge;


personal epistemologies, e.g., knowledge as situated and contextually bound mental entities; and epistemology as social practice.35 PEA falls under the final category, in which “knowledge is seen as a competent action in a situation rather than as correct, static representations of the world” (p. 286), i.e., as a shared social practice.35 Recent work that focuses on the activation of epistemic resources38,39 and epistemologies in practice40 can be seen as a bridge from personal epistemologies to epistemology as social practice,35 as they focus on how resources in individual students’ minds (closer to personal epistemologies)40 get activated under certain frames (closer to social practice).38 PEA, on the other hand, focuses on “how [students] get knowledge as acting participants” (p. 327) in the learning environment, which conceptualizes learning as contingent and inextricable from sociocultural circumstances.36 The object of PEA is thus not learning as change of resource activation in individual minds, but rather learning as relational change in discourse.37 In PEA, knowledge is conceptualized as what is immediately intelligible for participants in the interaction. Learning occurs when participants recognize a need to construct a new shared meaning, i.e., recognize a gap, by making new relations between knowledge pieces.37 Note that gaps are not what is traditionally understood as gaps in knowledge, but rather as participants’ need to make something intelligible to move forward in the task. Through operationalizing in-the-moment learning as noticing and filling gaps with relations, i.e., how students identify what they need to make clear to move forward, and how they do so, PEA directly relates to work focusing on how students relate knowledge pieces during reasoning. 
However, rather than providing a descriptive account of those relations, PEA allows for an analysis of how students come to know, i.e., what direction learning takes and what drives the formation of new relations between knowledge pieces. Recently, PEA has been used as a productive tool to investigate student in-the-moment learning in secondary and undergraduate chemistry classrooms.41,42 Hamza and Wickman41 looked at how students connect features in a problem to generalized explanations to make sense of galvanic cells. Manneh and collaborators42 examined student–tutor interactions to better understand the tutor’s role in problem solving while students reasoned about molecular structures. In this work, we want to demonstrate how PEA can move research on student reasoning in organic chemistry from attending to what meaningful relations students construct between pieces of knowledge (an ontological question) towards understanding what drives students to build those relations during their group learning (an epistemological question), to better understand how in-the-moment learning occurs. This leads us to the following research questions:

1. What drives students’ in-the-moment learning in LA-facilitated organic chemistry problem solving?
2. How do group members in LA-facilitated interactions construct new relations between knowledge pieces?

In-the-moment Learning of Organic Chemistry During Interactive Lectures


9.2  Methodology

This exploratory multiple-case study is part of a larger project aimed at characterizing LAs’ facilitation practices in physics and chemistry. In the analysis presented here, PEA was applied to in-class group discussions of organic chemistry content in a general chemistry course to explore how PEA can provide insight into students’ in-the-moment learning in organic chemistry.

9.2.1  Study Context

The study was conducted in a second-semester non-major general chemistry course at a primarily white, private research-intensive university in the north-eastern United States. The course used the Chemical Thinking curriculum, which is structured around big questions chemistry can answer, such as questions about chemical causality and mechanism, rather than around chemistry as a “body of knowledge.”43 Because the course was held during the COVID-19 pandemic, it was taught remotely, with classes taking place over Zoom. The large lecture class was supported by nine LAs, five of whom participated in the IRB-approved study. Each class included 2–3 breakout room sessions, during which students worked on collaborative problem solving in small, pre-assigned groups of 3–5 students. Each LA was assigned a certain number of breakout rooms and moved between rooms to facilitate discussion. 130 of the 140 students enrolled in the course participated in the study. Participants (LAs and students) were asked to complete an optional demographics survey, which all five participating LAs and 86 of the 130 participating students completed. Of the LAs, 60% identified as female and 40% as male, and all five identified as white (with one LA specifying their identity as both white and Ashkenazi). Of the 86 students, 68.6% identified as female and 31.4% as male. 4.6% identified as Latino/Latinx, 8.1% as Black, 17.4% as Asian, and 53.5% as white. 8.3% of the students selected more than one ethnicity, 7.1% preferred to self-describe, and 1.2% chose not to disclose this information.

9.2.2  Data Collection

The data presented here were collected during a lesson on predicting the relative acidity of different protons in organic compounds, an elaboration on the acid–base unit that typically takes place in later organic chemistry courses but can successfully be implemented earlier, in general chemistry.44 During two breakout room sessions, students were tasked with identifying the most acidic proton in two different sets of organic molecules (Figure 9.1). Groups were assigned such that all students in a group were either consented or non-consented, and consented LAs were pre-assigned to consented groups. LAs were given co-host privileges to record their interactions in the breakout rooms. Each of the five consenting LAs recorded both of their breakout rooms, leading to a total of ten recorded small-group discussions.


Figure 9.1  The two problems (A and B) that students worked on during the recorded lecture, including the four molecular structures that are referenced in the text as follows: phenol (A, left), carboxylic acid (A, right), diol (B, left) and trifluoromethylated diphenol (B, right).

9.2.3  Data Analysis

Breakout room video recordings were transcribed verbatim. Descriptions of salient gestures, non-verbal cues, or interactions with the course material were included in the transcripts. Data were analyzed independently by the first and last authors, who discussed their coding until they reached 100% consensus. Following the principles of PEA, each interaction was analyzed to identify (1) the gaps that were noticed, i.e., the questions “in the air” that the participants needed to make intelligible to move forward; and (2) how the gaps were filled, i.e., how the participants constructed relations between pieces of knowledge.36,37 We also attended to who filled the gaps, i.e., the LA or the students, and which students contributed which relations. Table 9.1 demonstrates how two different transcript excerpts translated into the constructs of PEA. There are two general ways in which a gap can be noticed and filled: (1) through the explicit posing of a question, which is then answered by someone in the interaction (Table 9.1, excerpt 1); or (2) through the implicit posing of a question, which is recognized primarily through its being answered (Table 9.1, excerpt 2).36,37 Note that for a gap to be noticed, it does not need to be posed as a question. Rather, students can construct relations that answer an implicit, unspoken question, such as Reena in utterance 02 (Table 9.1). We inferred that Reena was answering an implicit question distinct from the question on the slide, because the relations she provided answered the question of which molecule is more acidic, whereas the task as written asks which proton in each molecule is more acidic. This was then made explicit by the LA in utterance 03.

9.3  Results and Discussion

This exploratory multiple-case study allowed us to identify salient ways in which in-the-moment learning, understood through PEA, can manifest in student interaction data about organic chemistry problems during interactive


Table 9.1  Excerpts of student–student and LA–student interactions and corresponding practical epistemology analysis. Relations between pieces of knowledge are displayed with dashes.

Excerpt 1
14 Shakti: But isn’t the alkyl group donating?
15 Reshmi: Are alkyl groups injecting or withdrawing?
16 Shakti: Donating. So, alkyl groups are electron donating, and they localize the negative charge, and then high electronegativity groups that are electron withdrawing and delocalize the negative charge, which would increase the acid strength.

Practical epistemology analysis (excerpt 1)
Gap noticed (Shakti, Reshmi) (utterances 14–15): Are alkyl groups withdrawing or donating/injecting?
Gap noticed and filled (Shakti) (utterance 16): alkyl groups are electron donating—localize the negative charge—high electronegativity groups—electron withdrawing—delocalize negative charge—increase acid strength

Excerpt 2
02 Reena: I think it would be the first one [left structure in Figure 9.1A], because there’s more options for where the double bonds could be, so then it’s even more delocalized than this one [right structure in Figure 9.1A] where it’s only between two places.
03 Oliver (LA): Can you explain that a little more? Are you comparing the left structure to the right structure?
04 Reena: Yeah.

Practical epistemology analysis (excerpt 2)
Gap noticed and filled (Reena, LA Oliver) (utterances 02–04): Which molecule is more acidic? first molecule—there is more options for where the double bond could be—even more delocalized than the other one—double bonds only between two places—comparing the left structure to the right structure

lectures facilitated by LAs. In line with our two research questions, the results are divided into two sections. Section 9.3.1 addresses the question of what drives students’ in-the-moment learning by attending to the needs, i.e., gaps, that drive the conversation. Specifically, this section focuses on how students’ needs to make something intelligible build on and relate to each other. Section 9.3.2 addresses the question of how group members construct relations in the process of in-the-moment learning. Specifically, this section focuses on the different ways group members relate each other’s knowledge pieces.

9.3.1  What Drives Student In-the-moment Learning—Gap Patterns

In all ten interactions in our data set, students and/or their LAs noticed more than one gap. We found two major patterns: (1) Subsequent gaps were noticed when a participant in the conversation felt that the relations built to fill the


need of prior gaps warranted further questions, e.g., to better understand what was established previously, to address the organic chemistry problem more completely, or to deepen the learning. (2) Sub-gaps were opened when an overarching gap lingered and the participants were unsure how to fill the need of this gap. A sub-gap or multiple sub-gaps then supported the process of filling the overarching gap. To our knowledge, this distinction between subsequent gaps and sub-gaps has not yet been described in the PEA literature and thus adds to our understanding of what drives students’ in-the-moment learning.

To visualize patterns of gaps and the relations constructed to fill them, we use a canyon and bridge analogy (Figures 9.2 and 9.3). In this analogy, a canyon symbolizes a gap. Knowledge pieces used to bridge the canyon, i.e., fill the gap, are represented by short words or phrases in boxes, which correspond to students’ main ideas, and the connections between which imply relations. While subsequent gaps (pattern 1) are displayed at the same level across a canyon (e.g., Figure 9.2A and B), sub-gaps that help fill an overarching gap (pattern 2) are depicted one level down (e.g., Figure 9.2C and D). Note that in wording gaps and relations, we summarized students’ language to be more succinct and to capture core ideas. Below, we show two examples of each pattern to illustrate the variety in our data and to demonstrate how students and LAs introduce subsequent gaps, overarching gaps, and sub-gaps.

9.3.1.1 Pattern 1

Noticing subsequent gaps can drive the conversation forward, either towards a point of confusion or to challenge previously established relations. Students Ben and Lucas were discussing what makes a proton more acidic in the trifluoromethylated diphenol (Figure 9.1B, right) with their LA, Leah. After the students moved through a total of four gaps to establish some tentative answers to the question, Lucas and Ben asked follow-up questions (see gap 5 and gap 6 in Figure 9.2A). First, Ben opened gap 5 to ask how acid strength can be tied to characteristics of the molecule:

Ben: So, when an acid is reacting, the stronger the acid, the more localized the charge? Is that true?

Leah (LA): So like a lot of the stuff, we’re doing here is looking at the conjugate base, and we want the charge to be delocalized for the conjugate base, so that increases stability. So, we’re not really like looking at the actual acid. We’re looking at what happens with the conjugate base on the other side of the reaction.

Here, Leah filled the gap for the students. Her answer prompted Lucas to clarify how to predict the conjugate base, opening gap 6:

Lucas: But in order to do that, you have to figure out the more acidic, like, which proton’s leaving, right?

Leah (LA): Yeah. But the most acidic protons, you can still decide that by looking at the conjugate base. [ … ]   


Figure 9.2  Exemplars of the two gap patterns: subsequent gap pattern (A and B) and gap–sub-gap pattern (C and D). Student-opened gaps are indicated by a purple icon, and LA-opened gaps by an orange icon.

This illustrates two successive gaps, where participants felt that the relations made to fill the need of previous gaps, i.e., gap 5 and prior gaps, warranted further questioning to better understand what had been previously established. For example, when Leah used the piece “conjugate base” to fill gap 5, Lucas experienced a need to understand how he might look at a conjugate


Figure 9.3  Exemplars of the two relation patterns: separate pattern (A) and collaborative pattern (B). Student-contributed relations are indicated by a purple box, and LA-contributed relations by an orange box.

base without knowing which proton is most acidic and would thus be removed. Although here subsequent gaps were opened by students and filled by the LA, like the tutoring pattern described by Christian and Talanquer,31 subsequent gaps could also be opened by the LA. In the following example, LA Amelia noticed subsequent gaps to prompt students to elaborate on their decisions as they worked on problem A (Figure 9.1A). The student group had agreed that one proton in the phenol’s hydroxyl groups (Figure 9.1A, left) and the proton in the carboxylic acid’s carboxyl group (Figure 9.1A, right) were the most acidic in each molecule. They came to this conclusion by progressing through three gaps, in which they only focused on protons bound to oxygen atoms. To problematize this, Amelia opened a fourth gap in which she contrasted the hydroxyl protons with the aromatic protons in the phenol and encouraged the students to explain the chemistry behind their decision (see Figure 9.2B):

Amelia (LA): So then, what about all the H’s inside the ring? Why do you think those are not, what makes these not as good to take off? [marks the aromatic protons with cursor]

Through noticing this gap, Amelia challenged the focus on only oxygen-bound protons. She repeated this tactic when opening gap 5 (Figure 9.2B), asking the students to explain how they might decide between the proton in the carboxyl group and the alpha protons of the keto group at the other end


of the carboxylic acid (Figure 9.1A, right), both of which, if removed, would allow for resonance of the negative charge:

Amelia (LA): And there’s a double bonded O here [points with her cursor on the keto group]. So, what about resonance that could come on this side [points with her cursor to the alpha protons]? Why is this resonance more stable [points with her cursor on the OH of the carboxyl group] than the one over here [points with her cursor on the keto group]?

Like the interaction with Leah, this example demonstrates that subsequent gaps can be opened when prior relations come into question, creating a need to make something new intelligible. In this case, those relations came from earlier student utterances, which Amelia problematized by challenging the group’s focus on oxygen-bound hydrogens. This pattern of subsequent gaps pushing the conversation forward emerged regardless of who opened those gaps—here, they were noticed by an LA, while in the previous interaction they were noticed by students. Recall that in-the-moment learning is understood as constructing new relations between pieces of knowledge to make something intelligible that was not previously immediately intelligible. The distinction between the two examples we have shown here is thus who pushes the learning forward, and what drives them to do so: for Amelia’s group, the LA’s facilitation is the driving force of the in-the-moment learning, whereas for Leah’s group, students’ uncertainty drives the learning.

9.3.1.2 Pattern 2

The second gap pattern involves an initial overarching gap that is made intelligible through filling smaller sub-gaps. These sub-gaps are introduced while the initial gap lingers, i.e., when it is not filled and students may be unsure how to proceed. Sub-gaps then help fill the overarching initial gap, and thus support in-the-moment learning. For example, students working with LA Harper were trying to answer their principal question of which proton is more acidic in a diol (Figure 9.1B, left). After the group decided that (1) they wanted to focus on the hydroxyl groups, but (2) could not discern a difference between said hydroxyls, LA Harper opened a sub-gap (see gap 2 in Figure 9.2C) to draw the students’ attention toward explicit features of the molecule:

Harper (LA): So then, considering, what was on that last slide, what is bound to the carbon adjacent to say the oxygen of each? Was anything about those groups [inaudible]? It might be easiest to do one at a time kind of thing.

Here, Harper built on the overarching gap of ranking proton acidity by leading the students toward useful information to help them come to an answer. She opened gap 2 to draw students’ attention toward the explicit


differences between the alpha carbons. This pattern also arose in other similar situations, when an LA broke down gaps into smaller pieces to help students make sense of and address the overarching gap. Though LAs often noticed sub-gaps as a means of providing additional entry points to the problem, students could also open sub-gaps. Mai and Minh co-constructed sub-gaps when working with LA Oliver to determine the most acidic proton in a diol (Figure 9.1B, left). Initially, the group struggled to select a proton, and Minh noted that there were no resonance structures that seemed particularly favourable. Oliver prompted them with a sub-gap (see gap 6 in Figure 9.2D) to clarify whether there were any particular hydrogens they were considering:

Oliver (LA): Okay. So, you’re not seeing any resonance structures. Are there maybe two or more hydrogens that you’re trying to decide between here?

Here, Oliver noticed that the students were struggling with the broader initial gap and narrowed his line of questioning by introducing a sub-gap. This resembled Harper’s tactic, where she noticed a similar problem and addressed it by narrowing the students’ focus. Where this interaction differs is in its inclusion of sub-gaps noticed by students—after clarifying that they were attending to the two hydroxyl groups, Mai introduced a new gap, gap 7 (Figure 9.2D), to address the still-lingering overarching gap:

Mai: Does it have something to do with the amount of CH3 groups surrounding the OH, maybe?

Here, Mai followed the gap–sub-gap pattern by opening a new gap to help her group fill the overarching gap by considering the structure adjacent to the functional groups.
When sub-gaps were noticed by both LAs and students, different members of the interaction contributed to finding ways that might help satisfy an overarching need, e.g., by negotiating and co-constructing questions.31 Focusing on the patterns of gaps during student in-class discussions about organic chemistry problems can guide the field towards identifying how needs build on each other and drive in-the-moment learning forward. In the next section, we will discuss different ways students relate knowledge pieces during in-the-moment learning in collaborative problem solving.

9.3.2  How Students Learn In-the-moment of Group Discussions—Relation Patterns

We found two patterns of how students related knowledge pieces during interactions with each other: separately and collaboratively. When students related knowledge pieces separately, group members reasoned in parallel, sometimes coming to different conclusions. To figure out which proton is most acidic in the diol (Figure 9.1B, left), Ha and Imani took turns throughout the conversation to relate knowledge pieces separately. At the beginning of their discussion (gaps 1 and 2 in Figure 9.3A), Ha brought in the knowledge piece that a methyl group increases negative charge localization, and Imani drew attention to the visual disconnectedness of a hydroxyl group from the rest of the molecule:

Ha: Well, I see the CH3 group, which is the alkyne group, which is likely to increase the localization, I mean, increase localization of negative charge.

Amelia (LA): Do you agree with that?

Imani: Well, I was basing it off the last question, so I just thought it will be like that HO right there, that would be one of the acidic protons. The ones that are kind of disconnected from the rest of the molecule.

As the conversation progressed, the LA noticed and both students filled several gaps (see Figure 9.3A); however, Imani and Ha only related new knowledge pieces to their own starting points. For example, when asked which of the two hydroxyl groups was more acidic (see the beginning of gap 3 in Figure 9.3A), Ha related her idea of negative charge localization to choosing the top hydroxyl group (Figure 9.1B, left):

Ha: Okay. Would it be the top?

[…]

Ha: Cause this [CH3 group on the left of the diol] will add to the negative charge. Wait, no. Actually, I’m not sure.

While Ha constructed relations between the knowledge pieces “CH3 increases localization of negative charge”, “CH3 adds to negative charge”, and her choice of the top hydroxyl group as most acidic, Imani further elaborated on her starting point of the spatial disconnectedness of the hydroxyl groups. When asked whether one of the oxygens in the diol had a more delocalized charge, she attended to its placement on the structure (gap 6 in Figure 9.3A):

Imani: Well, I mean, because one of them is connected—I don’t really know how to phrase it. One of them is less of a charge connected to it, kind of. I really don’t know how to explain it. Cause the first one, what’s on top is next to a CH, and then the other one is next to just carbon, so I guess, I don’t know that it really makes a difference.

Imani constructed relations between her ideas of spatial disconnectedness, the amount of charge being connected, and the connectivity of the hydroxyl groups to CH vs. just carbon. Both students related multiple pieces of knowledge to fill the overarching gap; however, their lines of reasoning were separate


from each other. This carried through to the end of the discussion, when Imani and Ha came to different conclusions about the most acidic proton:    Imani: So, if you want delocalized charges, then it’ll be the bottom H, I think.      

Amelia (LA): Is that what you’re thinking, Ha?

Ha: I’m thinking it might be the top, but I’m not sure, cause I’m thinking the CH3 will add the negative charge, which will localize electron.

Though both students extended their reasoning throughout the interaction, they only connected pieces that they themselves introduced. Thus, in-the-moment learning only happened for each student individually. Despite the interaction seeming collaborative because both students contributed to the discussion, there is no evidence they learned from each other, because their change in discourse, which PEA conceptualizes as learning, did not incorporate each other’s contributions.

In contrast, the second relation pattern demonstrated continuity through multiple students co-constructing a connected network of relations. After reasoning about the diol, Kavita, Mia, and Sofia moved forward to discuss the most acidic proton in the trifluoromethylated diphenol (see gap 5 in Figure 9.3B). As they worked together, they established relations collectively by building on each other’s thoughts. They first established a relation between not seeing a difference and both hydroxyl groups being attached to opposite sides of the same ring:

Kavita: Well with the next one, I really don’t see—Oh. Wait, yeah. No, I really don’t see a difference.

  

Mia: Yeah, me too. They’re both attached—They’re both on opposite sides of the same ring.

Even while expressing confusion that there seemed to be no defining traits to differentiate the protons, Kavita and Mia built relations together. When the students moved to refine this original understanding, eventually selecting the proton on the hydroxyl near the trifluoromethyl group (see Figure 9.1B, right), they continued to co-construct relations to reason about this choice:

Kavita: [ … ] Oh, wait, no, this one’s closer [indicating that the left hydroxyl is closer to CF3]. And that one’s closer [indicating that the right hydroxyl is closer to the R group]. Sorry, my bad.

  

Sofia: I feel like the one on the left is closer to fluorines, which are electron withdrawing.

Kavita: So, would that make this one [indicating the proton in the leftmost hydroxyl] more likely, because then it would increase the pull, I guess, with the fluorine group?   


Kavita and Sofia co-constructed relations to reason about their answer and fill the gap. When one student introduced an idea, the other built on it—when Sofia pointed out that the fluorines were electron withdrawing, Kavita drew on her own understanding of electron-withdrawing groups (“it would increase the pull”) to claim that the leftmost hydroxyl proton was likely the most acidic. This comparison between the two relation patterns highlights how PEA can be used to track how different knowledge pieces are related during active learning, providing a tool to discern the continuity and discontinuity of different ideas and serving as a measure of collaboration at the knowledge-piece level.

9.4  Conclusions and Implications

A focus on gaps and relations can be used to understand how student in-the-moment learning occurs during organic chemistry activities. This creates a close connection between the disciplinary substance of student thinking, i.e., the organic chemistry content referenced in their gaps or relations, and the social aspect of learning, i.e., what needs drive the conversation and who makes which ideas continuous. Parsing out what and whose needs drive student in-the-moment learning, and who relates which knowledge pieces to each other, can be a way for instructors to reflect on how they facilitate group discussions and the types of questions they ask when talking to students.

Our future work will use PEA to relate typical interaction patterns between LAs and students to effective and ineffective learning moments. For example, Harper (Figure 9.2C, gap 2) and Oliver (Figure 9.2D, gap 6) narrowed students’ focus to specific molecular features at a point where students experienced a need for support to answer a more overarching question. Our future research will explore whether these types of questions might be more effective in such a scenario than when they are used in the absence of an overarching gap.

Layering other foci in organic chemistry education research on this type of analysis might open a multitude of different avenues. For example, one might wonder how instructors, including LAs, use the design features of contrasting cases in their facilitation.15,45 LA Amelia (Figure 9.2B) used contrasts to challenge student decisions and prompt elaboration, and LA Harper asked about the differences in adjacent groups to help students fill a more overarching need (Figure 9.2C).
Or one might layer a mechanistic framework11–14 on the analysis presented here, which, for example, reveals that the group interaction featured in Figure 9.2D and that featured in Figure 9.3B had very different dynamics (i.e., multiple sub-gaps being used to fill the need of an overarching gap vs. students collaboratively building a network of relations to fill one gap), while overall leading to a similar progression through elements of comparative mechanistic reasoning (i.e., starting with explicit differences, then transitioning to implicit properties and their effect on charge through the removal of a proton). Or one might use the framework of meaningful learning to think


about the role continuity plays in how students integrate prior knowledge and navigate social interactions,46–48 e.g., to make sense of the difference between how Ha and Imani built relations and how Kavita, Mia, and Sofia did so. These are just some examples hinting at the potential of PEA to parse in-class interaction data in a way that fosters productive analysis of how students learn, in the moment, to reason in organic chemistry.

Acknowledgements

The authors are grateful to the members of the Caspari group who helped refine our use of PEA, in particular Jennifer Pierre-Louis, Andrew D’Amico, and Destiny Strange. This material is based upon work supported by the National Science Foundation under Grant No. DUE-2000603.

References

1. K. Christian and V. Talanquer, Chem. Educ. Res. Pract., 2012, 13, 286–295.
2. A. Kraft, A. M. Strickland and G. Bhattacharyya, Chem. Educ. Res. Pract., 2010, 11, 281–292.
3. M. L. Weinrich and H. Sevian, Chem. Educ. Res. Pract., 2017, 18, 169–190.
4. N. E. Bodé, J. M. Deng and A. B. Flynn, J. Chem. Educ., 2019, 96, 1068–1082.
5. M. M. Cooper, H. Kouyoumdjian and S. M. Underwood, J. Chem. Educ., 2016, 93, 1703–1712.
6. O. M. Crandell, H. Kouyoumdjian, S. M. Underwood and M. M. Cooper, J. Chem. Educ., 2019, 96, 213–226.
7. O. M. Crandell, M. A. Lockhart and M. M. Cooper, J. Chem. Educ., 2020, 97, 313–327.
8. J. M. Deng and A. B. Flynn, Chem. Educ. Res. Pract., 2021, 22, 749–771.
9. H. Sevian and V. Talanquer, Chem. Educ. Res. Pract., 2014, 15, 10–23.
10. M. L. Weinrich and V. Talanquer, Chem. Educ. Res. Pract., 2016, 17, 394–406.
11. I. Caspari, D. Kranz and N. Graulich, Chem. Educ. Res. Pract., 2018, 19, 1117–1141.
12. I. Caspari, M. L. Weinrich, H. Sevian and N. Graulich, Chem. Educ. Res. Pract., 2018, 19, 42–59.
13. F. M. Watts, J. A. Schmidt-McCormack, C. A. Wilhelm, A. Karlin, A. Sattar, B. C. Thompson, A. R. Gere and G. V. Shultz, Chem. Educ. Res. Pract., 2020, 21, 1148–1172.
14. F. M. Watts, I. Zaimi, D. Kranz, N. Graulich and G. V. Shultz, Chem. Educ. Res. Pract., 2021, 22, 364–381.
15. J. Eckhard, M. Rodemer, A. Langner, S. Bernholt and N. Graulich, Chem. Educ. Res. Pract., 2022, 23, 78–99.
16. J. B. Houseknecht, A. Leontyev, V. M. Maloney and C. O. Welder, in Active Learning in Organic Chemistry: Implementation and Analysis, American Chemical Society, Washington, DC, 2019, pp. 1–17.
17. D. A. Canelas, J. L. Hill and A. Novicki, Chem. Educ. Res. Pract., 2017, 18, 441–456.
18. A. Chase, D. Pakhira and M. Stains, J. Chem. Educ., 2013, 90, 409–416.
19. M. T. Crimmins and B. Midkiff, J. Chem. Educ., 2017, 94, 429–438.
20. A. B. Flynn, in Online Approaches in Chemical Education, American Chemical Society, Washington, DC, 2017, pp. 151–164.
21. S. M. Hein, J. Chem. Educ., 2012, 89, 860–864.
22. Y. Liu, J. R. Raker and J. E. Lewis, Chem. Educ. Res. Pract., 2018, 19, 251–264.
23. S. R. Mooring, C. E. Mitchell and N. L. Burrows, J. Chem. Educ., 2016, 93, 1972–1983.
24. J. C. Shattuck, J. Chem. Educ., 2016, 93, 1984–1992.
25. C. O. Welder, in Active Learning in Organic Chemistry: Implementation and Analysis, American Chemical Society, Washington, DC, 2019, pp. 119–148.
26. K. N. White, K. Vincent-Layton and B. Villarreal, J. Chem. Educ., 2021, 98, 330–339.
27. S. F. Bancroft, S. R. Fowler, M. Jalaeian and K. Patterson, J. Chem. Educ., 2020, 97, 36–47.
28. H. E. Jardine and L. A. Friedman, J. Chem. Educ., 2017, 94, 703–709.
29. V. Otero, S. Pollock and N. Finkelstein, Am. J. Phys., 2010, 78, 1218–1224.
30. S. M. Ruder and C. Stanford, J. Chem. Educ., 2018, 95, 2126–2133.
31. K. Christian and V. Talanquer, Int. J. Sci. Educ., 2012, 34, 2231–2255.
32. U. Kulatunga and J. E. Lewis, Chem. Educ. Res. Pract., 2013, 14, 576–588.
33. U. Kulatunga, R. S. Moog and J. E. Lewis, J. Res. Sci. Teach., 2013, 50, 1207–1231.
34. A. Pabuccu and S. Erduran, Int. J. Sci. Educ., 2017, 39, 1154–1172.
35. G. J. Kelly, S. McDonald and P.-O. Wickman, in Second International Handbook of Science Education, ed. K. Tobin, B. J. Fraser and C. J. McRobbie, Springer, Dordrecht, The Netherlands, 2012, pp. 281–291.
36. P.-O. Wickman, Sci. Educ., 2004, 88, 325–344.
37. P.-O. Wickman and L. Östman, Sci. Educ., 2002, 86, 601–623.
38. A. Elby and D. Hammer, in Personal Epistemology in the Classroom: Theory, Research, and Implications for Practice, ed. L. D. Bendixen and F. C. Feucht, Cambridge University Press, Cambridge, UK, 2010, vol. 4, pp. 409–434.
39. D. Hammer and A. Elby, in Personal Epistemology: The Psychology of Beliefs about Knowledge and Knowing, ed. B. K. Hofer and P. R. Pintrich, Erlbaum, Mahwah, NJ, 2002, pp. 169–190.
40. L. K. Berland, C. V. Schwarz, C. Krist, L. Kenyon, A. S. Lo and B. J. Reiser, J. Res. Sci. Teach., 2016, 53, 1082–1112.
41. K. M. Hamza and P.-O. Wickman, Sci. Educ., 2013, 97, 113–138.
42. I. A. Manneh, C.-J. Rundgren, K. M. Hamza and L. Eriksson, Int. J. Sci. Educ., 2018, 40, 2023–2043.
43. V. Talanquer and J. Pollard, Chem. Educ. Res. Pract., 2010, 11, 74–83.
44. L. Shah, C. A. Rodriguez, M. Bartoli and G. T. Rushton, Chem. Educ. Res. Pract., 2018, 19, 543–557.
45. N. Graulich and M. Schween, J. Chem. Educ., 2018, 95, 376–383.
46. K. R. Galloway, M. W. Leung and A. B. Flynn, J. Chem. Educ., 2018, 95, 355–365.
47. N. P. Grove and S. Lowery Bretz, Chem. Educ. Res. Pract., 2012, 13, 201–208.
48. T. Gupte, F. M. Watts, J. A. Schmidt-McCormack, I. Zaimi, A. R. Gere and G. V. Shultz, Chem. Educ. Res. Pract., 2021, 22, 396–414.

SECTION C

Chapter 10

Flipped Classrooms in Organic Chemistry—A Closer Look at Student Reasoning Through Discourse Analysis of a Group Activity

SUAZETTE R. MOORING*a, NIKITA L. BURROWSb AND SUJANI GAMAGEa

aGeorgia State University, Atlanta, Georgia, USA; bMonmouth University, West Long Branch, New Jersey, USA
*E-mail: [email protected]

10.1 Introduction

Organic chemistry is a challenging course. Student difficulties with organic chemistry concepts1 and the high withdrawal and failure rates2 are well documented in the literature. To alleviate some of these challenges and to improve students' motivation in the course, some organic chemistry educators have turned to active learning approaches. Active learning has been shown to improve student outcomes and to benefit students traditionally underrepresented in science, technology, engineering, and mathematics (STEM) disciplines.3,4 One approach that has gained traction in chemistry courses is flipped learning, in which some or most of the lecture material is moved online. Students typically review the content videos before class, while class time is reserved for problem-solving, group work, and enhanced discussions.

Advances in Chemistry Education Series No. 10
Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices
Edited by Nicole Graulich and Ginger Shultz
© The Royal Society of Chemistry 2023
Published by the Royal Society of Chemistry, www.rsc.org

In general, studies of flipped organic chemistry courses have found that failure and withdrawal rates decrease.5–7 For example, Flynn reported that student achievement increased in organic chemistry courses, as evidenced by higher student grades and lower failure rates.6 In the same course, withdrawal rates were also reduced compared with previous years. Similar results were reported separately by Fautch5 and Mooring et al.7 Student perceptions and attitudes were also positively affected by flipped organic chemistry classrooms.7,8 Rau et al. found that active learning had positive effects on students' ratings of the textbook.9 Shattuck reported that students commented on the transferable skills they developed in the flipped classroom; for example, students reported greater comfort using technology outside of class, working in groups, and participating in active learning activities.10,11 Furthermore, Shattuck reported that students felt the flipped class improved their critical thinking, problem-solving ability, connection with course concepts, and ability to transfer knowledge to future courses.10,11 In the sections below, we examine some key components of the pre-class and in-class activities of flipped classrooms and their reported impact on student outcomes in organic chemistry.

10.1.1 Pre-class Activity—Videos

One of the prominent components of the flipped classroom involves moving lecture information outside of the classroom, most often in the form of online videos. Students are typically required to watch the videos before class. Seery12 and Christiansen13 both reported that students consistently viewed the videos before in-class activities. A recent study by Casselman found that the asynchronous online learning environment (pre-class videos) played a more significant role in student learning than in-person collaborative group work.14 The online learning component also appeared to account for most of the improvement in post-test scores observed in the flipped classroom treatment.14 Students listed the pre-class videos of the flipped classroom as an important contributor to their performance.7,14,15 Shattuck used student survey responses and focus groups to gather student comments about the flipped classroom.10,11 Students frequently stated that one of the major benefits of flipping was the ability to re-watch the videos while studying, pause them as needed, and watch them at a time of day when they were mentally alert. Mooring et al. reported similar comments from students, with the majority of positive comments referring to students' ability to re-watch the videos to clarify concepts.7 In Rossi's study, students enjoyed the on-demand lecture videos in conjunction with the increased structure of the course.16 Though these studies reflect students' positive perceptions of video lectures, it is still unclear how the delivery of video lectures affects student learning and reasoning about organic chemistry concepts.

10.1.2 Pre-class Activity—Quizzes

In a few studies, students were asked to answer low-stakes quiz questions during or after the pre-class video. Pre-class quizzes were used for a variety of reasons, including promoting notetaking,17,18 facilitating better engagement with the content,7,13,15,19 providing students with evidence of gained knowledge,6 guiding students to important learning objectives, and guiding mini-lectures given by instructors.5,6,20 Quizzes were given online directly after watching a video,5,7 right before beginning in-class activities,6,18 or as a combination of both.21 Learning management systems were typically used to administer pre-class quizzes. Most studies used quizzes as a follow-up to the pre-class online videos; however, only one study compared outcomes for the two quiz formats.21 Christiansen et al. investigated student preferences and knowledge gains for online pre-class quizzes vs. in-class pre-class quizzes and found that most students preferred in-class quizzes.21 Students also performed worse with take-home quizzes. Feedback from students in that study indicated that online quizzes disincentivized attendance and video-watching before class and decreased engagement with the videos. Many authors point to the importance of post-video quizzes for motivating students to watch the videos; however, there are few formal studies of the effect of such quizzes on student engagement and learning.

10.1.3 In-class Activity—Student Response Systems

Student response systems were implemented in many large flipped organic chemistry courses as a formative assessment tool to provide "just-in-time" instruction for students.6–8,14,16–18,20,22,23 A few studies have examined outcomes and perceptions of student response systems in the flipped organic chemistry classroom. Mooring et al. found that approximately 92% of students agreed that in-class clicker questions helped them better understand the course material.7 Flynn examined students' comments on course evaluations and found that some students thought clicker questions enhanced learning and problem solving while providing evidence of their real-time comprehension.6 Beyond student perceptions, however, it is not clear how these response systems affect students' learning in organic chemistry courses.

10.1.4 In-class Activity—Group Work

In many flipped classrooms, group problem-solving was the most commonly used in-class activity. Group activities in flipped classrooms included worksheets completed as group work,6 peer assessment,13 group quizzes,22 process-oriented guided inquiry learning (POGIL),17 and peer-led team learning (PLTL).7,8 Group activities occurred in small groups of three to five students, with questions developed by the instructor or drawn from a textbook publisher's repository. Groups were assembled randomly,7,16,18 selected based on complementary student characteristics,5,10 or selected based on POGIL roles.17

Many studies discussed students' positive comments regarding group work. For example, Flynn reported that students noted in course evaluations that the in-class group work helped them prepare for exams.6 One study reported that students consistently gave high ratings to in-class group problem-solving activities.18 However, others reported no statistically significant difference in exam scores between students completing group work in the flipped classroom and students in a traditional lecture with collaborative group learning.9 There were, however, indications that students in the flipped collaborative condition showed a higher ability to reason about concepts than those in the traditional lecture collaborative condition.9 When exploring why students enjoyed group work, a few common trends emerged across several studies.
Muzyka and Christiansen both reported that students enjoyed explaining and teaching information to peers.13,20 Similarly, using survey responses and focus group comments, Shattuck found that students appreciated the in-class problem sets and writing answers on the board, and felt that they had more time to ask questions in class.10 Fautch, through instructor observations, reported a student group debating back and forth about correct versus incorrect attributes of the arrows involved in a problem.5 Students in this study expressed that explaining concepts to others made it easier for them to remember the information and consolidate their understanding of the material.5

A few studies have utilized evidence-based peer-led group discussions in the organic chemistry flipped classroom environment.7,8,22,24 These studies examined the facilitation of small group discussions and activities through PLTL in large-enrollment flipped organic chemistry courses (O-Flip-PLTL). PLTL replaced a typical lecture session with small groups of 10 to 19 students per leader. In most studies, peer leaders received training on how to facilitate the sessions.7,8,24 One study comparing O-Flip-PLTL with traditional lectures showed better student performance for the O-Flip-PLTL students.23 Most studies found differences in students' attitudes, motivation, and course retention outcomes rather than statistically significant differences in grade performance. In general, studies using assessment instruments such as the Attitude toward the Subject of Chemistry Inventory (ASCI),7 other survey items,25 and open-ended comments7 have found that most students responded positively to O-Flip-PLTL.

Another study specifically investigated student motivation in flipped vs. lecture organic chemistry courses.8 In this study, motivation was defined as a multidimensional construct along a continuum from amotivation (no motivation) through extrinsic to intrinsic motivation. The researchers found that students were less amotivated in an O-Flip-PLTL instructional environment.8 Specifically, they uncovered a significant negative correlation between amotivation and exam grades. This finding may also help explain the lower failure/withdrawal rates and increased pass rates frequently reported for other O-Flip-PLTL courses. Several studies have also highlighted increased course retention with O-Flip-PLTL, and the data suggest that O-Flip-PLTL promotes a more uniform, less dichotomous grade distribution, particularly for students underrepresented in STEM.19

McCollum and co-authors took an in-depth look at social interactions in the O-Flip-PLTL classroom.22 The authors postulate that two factors were essential to deepening students' conceptual understanding and developing problem-solving skills: peer-leader support and student–student interaction. They also found that numerous students identified the formation of peer–peer relationships as a key benefit of the flipped classroom. Students expressed the desire to contribute to the team, support peers, and provide peer feedback, thus allowing students to self-regulate. In addition, the authors observed that as students discussed and debated their understanding with one another, they learned how to articulate their thought processes.22

Despite the prominence of group activities in flipped classrooms, little is known about how students interact during such activities and how these interactions might contribute to positive student outcomes. There is also a need to understand how best to design group work to maximize its benefits to students.

10.2 Student Dialogue in a Flipped Course—A Case Study

Most studies of flipped classrooms report overall outcomes; few have focused on how the individual components of the flipped classroom contribute to positive student outcomes. As previously noted, a primary goal of many flipped classroom implementations is to reserve class time for increased student engagement with the content and with each other. As such, group learning is frequently used to have students engage in problem solving and reasoning together. Although studies of flipped organic chemistry courses have hinted that group activities may affect student reasoning and argumentation,6,7,13,20,22 there are no studies focused on the nature and quality of the discussion and the discourse moves involved in the co-construction of ideas during group activities.

Theories of social constructivism and social cognitive theory26,27 suggest that engaging in problem solving through social interaction should result in student learning gains. These learning theories also note that when students discuss material with their peers, it helps them clarify ideas and engage in sense-making about concepts.28,29 However, the efficacy of collaborative learning approaches depends on the quality of students' interaction.30 Therefore, simply putting students into groups and encouraging discussion may not result in productive group discourse. The study by McCollum and co-authors, discussed above, emphasized the importance of peer–peer relationships in the outcomes of group activities in flipped classrooms.22

Toward this end, we describe a case study examining the nature and quality of students' dialogue as they engaged in a group quiz. The group quiz was part of ongoing group-centered activities in a flipped organic chemistry course. We characterize the nature of students' dialogue through the interactive–constructive–active–passive (ICAP) framework31–33 and the quality of the groups' dialogue through the presence of the argumentative elements of opposition and questioning.34,35 We then discuss the potential impact and implications of the findings of this case study for the use of group discussion to elicit student reasoning in flipped and other active learning organic chemistry courses. This study was approved by the IRB boards of the participating institutions. The following research questions are addressed:

1. What is the nature of students' discourse during a group quiz, as defined by the ICAP framework?
2. What is the quality of students' discourse during a group quiz, as defined by the presence of opposition and questioning?

10.2.1 The ICAP Framework

The ICAP framework, developed by Chi,33 proposes that the benefit of learning from collaboration depends on the type of dialogue patterns in which students engage. The framework defines four overt engagement levels: passive at the lowest level, then active, constructive, and interactive at the highest level. It posits that these observable behaviors are indicators of students' underlying cognitive processes and level of learning.30 That is, as students become more cognitively engaged with the material, their learning and understanding are hypothesized to increase.

Passive engagement, in the context of group discussion, refers to instances where a student receives information but does not exhibit any additional overt behavior,30,33 for example, reading silently or uttering agreement without elaboration. Passive engagement results in minimal understanding since students do not have the opportunity to connect new information to their prior knowledge.30,33

During active engagement, students typically receive information or repeat what is already in the given learning material,30,33 for example, reading the question prompt aloud or repeating what a peer has presented in the group activity. Active dialogue typically results in shallow understanding: the student has an opportunity to integrate new knowledge with related prior knowledge, but that opportunity is limited.

In constructive dialogue, students introduce new knowledge to the group based on their own reasoning, beyond what is presented in the activity or by other peers. The constructive dialogue pattern is associated with deep understanding since new information is activated and integrated with prior knowledge, giving rise to new understandings. Finally, interactive engagement occurs when a student's contribution is made in response to another peer's input. In this case, an exchange of ideas occurs between peers, and knowledge is therefore co-constructed.30,31,33 Interactive dialogue leads to the deepest understanding and results in new understanding that would not have been generated by students individually.33

The ICAP framework can indicate the level of learning expected from students' overt behaviors and interactions. However, to further explore students' reasoning and their contributions to the dialogue, we also coded evidence of the quality of argumentation. Coding both ICAP levels and argumentation quality provides a more holistic and comprehensive understanding of the discussion and of students' overall contributions to the dialogue.

10.2.2 Argumentation and Student Reasoning in Organic Chemistry

Argumentation plays an important role in students' learning of scientific concepts and is central to the processes of thinking, scientific reasoning, and the development of conceptual understanding. Few studies in organic chemistry examine student reasoning through argumentation; most have involved interviews with students to investigate their conceptual understanding36 or analysis of students' written responses.37 Pabuccu and Erduran have published work on argumentation in organic chemistry.34,35 In one publication, they examined how pre-service teachers engage in argumentation about conformational analysis of alkanes in a group activity.35 The authors identified practical difficulties in using Toulmin's argumentation framework when distinguishing between data and warrants or between warrants and backings. They therefore derived a method for determining the quality of argumentation by counting occurrences of opposition and rating the strength of the rebuttal that followed. The authors argue that rebuttals indicate the quality, level of sophistication, and complexity of an argument.35 Rebuttals in which the counterclaims were unrelated were characterized as low-level argumentation; in contrast, rebuttals made in direct reference to data, warrants, or backings offered by the opponent were considered higher-level argumentation. We also recognize that question-asking by students can invoke productive argumentation.38,39 We therefore searched for the presence of opposition and question-asking throughout the groups' discussions as an indicator of the quality of the discourse.

10.2.3 Course Context and Participants

This study includes organic chemistry students at a primarily undergraduate institution in the southeastern United States. All students in the groups identify as female and of African descent. In this flipped course, students viewed lecture videos, accessed practice problems, and completed assessments independently through the institution's learning management system. Students were also provided with a slide deck for each chapter, instructor-developed reading guides, and problem sets for class discussions. Students completed an online pre-quiz for each topic consisting of ten to fifteen multiple-choice prompts that varied in difficulty.

Group activities were an integral part of every 50 min class meeting. In the first 5–10 min of a typical class period, the instructor answered student questions about the previous class period; in total, this content delivery spanned 15–20 min. Students used the think–pair–share model40,41 to complete the problem set during the class period. At the end of a topic, students completed a post-quiz to assess their mastery of the related concepts. Students typically remained in the same groups throughout the semester.

10.2.4 Group Quiz Format

During group quiz sessions, students could not use supplemental information but were required to discuss the prompts with group members to arrive at a solution. Students were given the topic before the quiz and were expected to review the relevant concepts. The instructor was available if a group needed clarification on the quiz prompts. The quiz discussed herein concerns alkane conformations. This topic required students to process structural information relating to the three-dimensional spatial relationships of groups or atoms generated by rotation around a carbon–carbon single bond. Such structural information is typically depicted in a Newman projection of the molecule. Students responded to a series of prompts to demonstrate their ability to correctly draw staggered and eclipsed conformations and to compare the relative stability of two molecules based on their conformations.

10.2.5 Data Collection and Analysis

Each group was video- and audio-recorded using overhead cameras set at a fixed angle. iPads equipped with the screen-capturing app Explain Everything42 were used to record group responses to the quiz. Recordings in which all group members were visible on camera and the audio was clear were chosen for analysis. Three groups met these criteria over two quizzes; here we discuss two of them: Group A, with three students given the pseudonyms Ashley, Candice, and Tiffany, and Group B, with four students given the pseudonyms Brittany, Dawn, Farrah, and Jasmine. The recordings were transcribed, and the transcripts were arranged by speaking turn, with each line of the transcript representing each time a different student begins to speak. Each speaking turn in the dialogue was coded as interactive, constructive, active, or passive using the definitions from the ICAP framework.30 If a student's speaking turn was not related to the group quiz, it was coded as "off-task."


The transcripts were also coded for instances of opposition, and each instance was rated for the quality of argumentation based on the criteria of Pabuccu and Erduran.35 We searched the transcripts for words or phrases indicating that one group member disagreed with another. We also observed that students used questioning as a subtle form of opposition to further probe a peer's ideas.38,39 The coding for opposition and level of argumentation was first done collaboratively by SG and SRM; another section of the transcript was then coded separately by SRM and SG to establish the reliability of the codes. Coding of opposition and level of argumentation continued iteratively, with discussion between SRM and SG, until 100% agreement was reached. ICAP coding continued until satisfactory interrater reliability was reached between coders; Cohen's kappa for the ICAP coding was 0.73, indicating moderate to good agreement.43
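Cohen's kappa corrects the two coders' raw percentage agreement for the agreement expected by chance, given how often each coder used each code. A minimal sketch of the calculation follows; the two code sequences are invented for illustration and are not the study's data (statistical packages also provide this computation):

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Cohen's kappa for two raters assigning nominal codes to the same items."""
    assert len(coder1) == len(coder2) and coder1
    n = len(coder1)
    # Observed agreement: fraction of items both coders labelled identically.
    p_o = sum(a == b for a, b in zip(coder1, coder2)) / n
    # Chance agreement from each coder's marginal code frequencies.
    c1, c2 = Counter(coder1), Counter(coder2)
    p_e = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ICAP codes (I/C/A/P) from two coders for ten speaking turns.
coder_a = ["A", "A", "C", "I", "P", "A", "I", "C", "A", "P"]
coder_b = ["A", "A", "C", "I", "A", "A", "I", "C", "A", "P"]
print(round(cohens_kappa(coder_a, coder_b), 2))  # → 0.86
```

Here the coders agree on 9 of 10 turns (0.90 observed agreement), but the marginal code frequencies make 0.30 agreement expected by chance, giving kappa = (0.90 − 0.30)/(1 − 0.30) ≈ 0.86.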

10.3 Findings

In this section, we discuss findings from the case study comparing the dialogue patterns of two groups (A and B) on one prompt from quiz 2. We provide a summary description of each group's dialogue, then compare the two groups on the nature of students' discourse using the ICAP framework and on students' reasoning through instances of opposition and the quality of argumentation. Each group responded to the following quiz prompt: "Explain the stability of the least stable eclipsed conformations of butane defined for the rotation around the C2–C3 axis in comparison to that of octane defined for the rotation around the C4–C5 axis."

10.3.1 Group A Summary

The dialogue among students in Group A on this prompt consisted of only 11 speaking turns (see Table 10.1). The group spent the first half of their conversation trying to determine what the prompt was asking them to do (lines 1–5). They seemed to struggle to decipher the meanings of less and more stable as they relate to Newman conformations (lines 2–4). Candice also decided at the beginning of the discussion (line 1) that the group should answer the prompt in words rather than with drawings, and the other group members did not object.

Tiffany introduced the first piece of information not explicitly mentioned in the prompt (line 7), in response to Ashley's question about why they should be looking for the least stable conformation (line 6). Tiffany's idea was that energy may be related to "things" being close to each other and that energy is used to "break things apart." Ashley followed up on Tiffany's idea to clarify that when the atoms are close together, they are high in energy. In line 9,


Table 10.1 Speaking turns for Group A's responses to the quiz prompt.

ICAP codea  Speaking turn
A   1. Candice: (reads prompt aloud). 4 carbons, 8 carbons, I think more of us explaining, than drawing
A   2. Ashley: So, we're just explaining why that is? The stability, it's less stable in butane than in octane - that's what it says
A   3. Tiffany (scribe): So, you're saying butane is more, you said it's more stable?
A   4. Ashley: That's what it said. No, it's least
A   5. Tiffany: (rereads the prompt and refers to the previous prompt about comparing the least and most stable conformation of heptane). That's what the question is saying. So basically, in the previous question it's on top of each other (meaning eclipsed)
P   6. Ashley: I don't know why
C   7. Tiffany: Well like, it's because it needs more energy to … because I know it takes a lot of kilojoules, to like, for something break, break apart, or it's like how they're close to each other
I   8. Ashley: When they're close together, they're high in energy, but they wanna be low in energy
I   9. Tiffany: Yeah. So can we say that for butane the methyl groups may be closer to each other than that of octane (Tiffany scribes this on iPad).
A   10. Tiffany: So, is that a good enough answer?
P   11. Ashley: I think it is
P   12. Tiffany: Because I don't know what else it would be

a ICAP codes: P = passive, A = active, C = constructive, I = interactive.

Tiffany incorrectly stated that the groups in butane are closer together than those in octane. Unfortunately, this is where the group ended their discussion. Although Tiffany did not articulate all the details regarding the groups being closer together, this statement could have generated additional ideas about how group size in butane and octane may affect the energy and stability of the conformations shown in the Newman projections. The group also did not draw any Newman projections during their discussion to further clarify their understanding of the concept. They concluded their work by writing up the answer shown in Figure 10.1; their written answer does not explain or compare the differences between the eclipsed conformations of butane and octane. Overall, the group struggled to understand what was required and spent most of their time trying to figure it out. It is possible that the members of the group did not have the prior knowledge needed to address the prompt sufficiently.

We can also examine the relative contributions of each student in Group A to the overall dialogue. The three group members did not contribute equally: Ashley and Tiffany each contributed about 45% of the group's dialogue, while Candice had only one speaking turn (line 1), in which she read the prompt to the group and suggested a strategy for answering the question.


Figure 10.1 Group A's final answer to the quiz prompt (this is the group's original work).

10.3.2 Quiz 2, Prompt 5—Group B

Group B's dialogue for the same prompt was almost double the length of Group A's (Table 10.2). This group initially worked independently for about two minutes before any discussion began. Farrah started the group off by correctly stating that the least stable conformation of butane would be an eclipsed conformation (line 2). She continued this line of reasoning on line 4, saying that the least stable conformation of both butane and octane was needed in their response. Brittany simply agreed with Farrah each time (lines 3 and 5), and there was no audible response from the other two group members, Jasmine and Dawn.

The conversation that followed, from lines 8 through 24, consisted almost entirely of co-constructive dialogue between Brittany and Farrah. Their discussion allowed them to think more carefully about the problem as they probed each other's input. For example, on lines 19 through 25, Brittany and Farrah have an exchange about the end groups (CH2 or CH3) for the Newman projection drawings. On line 19, Brittany questions whether the end group for octane should be CH2. Farrah responds (though incorrectly) on line 20 to explain why the end groups should be CH2 groups. Note that the Newman projection of octane offered by the group is incorrect (see Figure 10.2), and there is not a significant size difference between CH3 and CH2. Despite the flaw in the drawing, however, the group's dialogue allowed them to surmise the relationship between the size of the interacting substituents and the resulting energy cost (lines 21–23): they correctly rationalized that larger interacting groups carry a higher energy cost than smaller ones. In the end, Brittany dictated a summary of the group's final answer, which includes drawings, to the scribe (line 23).

Group B was composed of four members; however, only two of them contributed to the group's response to this prompt. The video recording of the group showed that one of the group's members was the scribe and

Chapter 10

172

Table 10.2 Dialogue for Group B discussion of the quiz prompt. a

ICAP code Speaking turn

A C P A P C I

A A A A A I I I P I I I I I A A

a

1. Brittany: (reads prompt out loud) (Students work independently on paper for approximately 2 minutes)
2. Farrah: So, the least stable of the butane will be an eclipsed of the butane, right?
3. Brittany: Yeah
4. Farrah: So, it's the least stable of both
5. Brittany: Yeah (Students work independently on paper for 30 seconds)
6. Brittany: The structures will pretty much look the same, but we just have to … it's probably something that has to do with how big it is
7. Farrah: The only difference I have is just the groups that are on it, like butane are ethyl groups … I mean methyl groups and then for octane I just gave them CH2's like for heptane, but still [carbons] 4 to 5
8. Brittany: Is it methyl groups for that one?
9. Farrah: For butane?
10. Brittany: Yeah, it just says for carbons 2 and 3 so I was looking at the middle ones
11. Farrah: For butane it still should be that?
12. Brittany: CH3?
13. Farrah: Yeah this (pointing to paper) is [carbon] 2 and [carbon] 3, if you're looking at what's on the ends … these are just methyl groups
14. Brittany: I wasn't looking at the ends. I was looking at the carbons 2 and 3, but we need to look at the end?
15. Farrah: Yeah, that's gonna determine like what's up and what's down [on the Newman projection]
16. Brittany: Right, you're right
17. Farrah: So yeah … just what we were doing before, see the butane one looks like this, base it off the two ends
18. Brittany: So, for octane it will be the CH2?
19. Farrah: Yeah, it literally looks the same as heptane, it's just a CH2, CH2, up and down
20. Brittany: So … wouldn't the butane have the higher energy with the CH3 and therefore be more stable?
21. Farrah: Yes, CH3s are worth more than CH2s I would think. What do you call that, not cost, their overall energy cost?
22. Brittany: Yes! the energy cost is greater for the CH3. (Brittany restates the integrated form of the answer.)
23. Brittany: (dictates to the scribe): I would just write the Newman projection. Ok. So. For butane. I would just write the Newman projection. So, we are drawing eclipsed conformations. So for butane the two overlapping would be the CH3 groups, but for octane CH2 groups are overlapping. CH3 groups have more energy. Like so, the energy cost for these to overlap are greater than one with CH2. And you know for molecules, the higher the energy the less stable it is because they are always trying to get to the lowest energy possible. So that's why this is the least stable because it has the most energy.

ICAP codes: P = passive, A = active, C = constructive, I = interactive.

Flipped Classrooms in Organic Chemistry—A Closer Look at Student Reasoning


Figure 10.2 Group B's response to the quiz prompt (this is the group's original work).

therefore, may not have contributed as much to the dialogue. The other student was actively writing on paper throughout this discussion. For both students, their body language suggested that they were actively listening to the conversation, even though they did not contribute to the group's response to this prompt.

10.3.3 ICAP Analysis—Comparison of Group A to Group B

Group B spent almost half of their dialogue in the interactive/constructive mode, whereas for Group A only 25% was interactive or constructive (see Figure 10.3). Group A's dialogue can be characterized as primarily active/passive, since most of their dialogue (75%) focused on restating the question and repeating information that was already given in the prompt; only limited portions of the dialogue involved the sharing of new ideas by individual students (constructive) or the co-construction of ideas among members of the group (interactive). Interactive and constructive dialogue leads to the deepest level of learning, while active and passive dialogue leads to shallow learning outcomes.32 Although Group B's dialogue was dominated by two of the four group members, Farrah and Brittany, their discussion was more likely to have generated deeper learning gains for all the group members, since there were several opportunities to expand their knowledge by incorporating peers' reasoning with their own and co-constructing their understanding together.

Chapter 10


Figure 10.3 ICAP comparison for Group A vs. Group B for the quiz prompt. ICAP codes: A + P = active and passive, I + C = interactive and constructive.

10.3.4 Argumentation—Comparison of Group A to Group B

For Group A, there are no instances of opposition and therefore no argumentative elements present in the discourse. For Group B, there are no direct instances of opposition; that is, there are no instances in which a student says: “No, that is not correct” or “I do not agree”. However, there are instances in which Brittany presents oppositions to Farrah's claims by posing them as questions (lines 8, 11, 13 and 15). From lines 8 through 19, Farrah and Brittany have an interactive discussion on how the Newman projections of butane and octane should be drawn. For example, on line 7, Farrah suggests that the end groups on butane's Newman projection should be methyl groups and that for octane they should be CH2 groups (see the group's final answer in Figure 10.2). Brittany seems to oppose this by asking whether it should be a methyl group for butane. However, she does not give any further reason as to why she was unsure. Brittany follows up with an explanation in line 10 that she is looking at the groups attached to carbons 2 and 3. Farrah does not seem convinced and asks: “For butane it still should be that?” Farrah uses her drawing (line 13) to show that there are methyl groups on the ends, and Brittany agrees with this in line 16. Brittany and Farrah go through a similar but shorter exchange for octane as Farrah tries to convince Brittany that the end groups for octane should be CH2 (lines 18–19). The exchange between Brittany and Farrah forces them to further explain their positions.

10.4 Conclusions and Implications

An analysis of only the final answers presented by the two groups may indicate that neither group was successful in answering the question. However, an analysis of the dialogue between students of Group A and Group B


tells a different story. A focus on the process of how students in the group co-constructed knowledge seems more insightful than their final answer. Group A primarily used an active dialogue, which according to Chi's ICAP framework would produce shallow learning gains. Additionally, students in this group had no instances of opposition, which is indicative of a low-quality dialogue. It is unlikely that this dialogue generated significant learning gains for the students in this group. For Group B, half of the conversation was interactive or constructive, and argumentative elements were evident through questioning. Given that cooperative group activities are a common component of flipped classrooms, it is critical that chemistry educators pay attention to how these activities are designed and whether they elicit productive types of discourse. The findings of the case study suggest several aspects of group activities that need attention:

10.4.1 Scaffolding Questions to Promote Argumentation

Scaffolding is needed for students to ask the right types of questions. Students' questions foster critical dialogue, thinking and brainstorming towards a group consensus, and are key to orchestrating argumentation.38,39,44 Question-asking is also a metacognitive tool that can broaden the nature and frequency of argumentation patterns.45 In our analysis of Group B, we observed that Farrah and Brittany's exchange through questioning (see Table 10.2) was able to foster an extended stream of co-constructive dialogue. Also, Group A struggled to understand what the quiz prompt was asking them to do, and the inclusion of scaffolding questions may have helped the group move forward in a more productive way. Research by others suggests that supporting students in question-asking could stimulate them to engage in meaningful argumentation.39,45 Creating conditions that foster productive collaborative discourse, such as having students pose questions and respond to each other, is therefore critical. Questions such as “What is the evidence to support my view?” can help scaffold the question-asking process. We also observed that Group B spent two minutes thinking before they began to discuss the problem, and this seemed to generate additional discussion. One study suggests that students brainstorming individually about what questions they have and then verbalizing them out loud can help other students to be more aware of what they did and did not understand. Students often have difficulty in disagreeing with a peer's ideas; they try to be supportive instead of offering disagreement.46 Therefore, providing students with prompts that appear as part of the assignment, and are perhaps less confrontational, can be useful. Asterhan & Schwarz47 used the following prompts to promote opposing ideas: “Ask your partner a question about an utterance that is unclear” and “Describe opposite ideas against your partner's idea or claim”.
These prompts foster argumentation that could facilitate more productive interaction and reasoning among groups.


10.4.2 Group Composition and Roles

Group B operated at higher ICAP levels and had at least two students who frequently engaged in interactive/constructive dialogue. There were some groups in which students did not contribute or had limited contributions. We observed that in Group B, the scribe was not actively discussing the problem but was actively listening in order to record the answer. Therefore, it may be important to assign students group roles so that each student is responsible for a task. Similar ideas are expressed in the formation of groups for POGIL activities.17,48 These roles can include Encourager, Facilitator, Spokesperson, Scribe, Quality Control, Process Analyst, Manager, Recorder, Reader, or Materials Manager. Students can then rotate roles so that each member can contribute to the group in diverse ways, fostering accountability.49

10.4.3 Incorporating Student Observations in Assessment of Group Activities

This case study reveals the importance of observing students during group activities. Instructors or a designated assistant should consider observing students during group work. These observations can take the form of audio and/or video recordings, or field notes made during student discussions. The insights gained can be invaluable in helping instructors decide how to make changes to support group dynamics and provide appropriate scaffolding to further improve student reasoning. Overall, our discussion here supports the need for more research studies that provide further insight into student dialogue during group activities and into the effect that specific prompts, scaffolding, and group roles have on student argumentation and reasoning in the context of organic chemistry. More research is also needed on the individual components of flipped courses, such as video recordings, to better understand how they can be implemented to maximize student learning and success.

Acknowledgements

The authors thank Dr Leyte Winfield and Dr Joy Ballard for their contributions to this work. The authors would also like to thank the National Science Foundation (NSF) for funding this research (#1625414).

References

1. G. Bhattacharyya, Chem. Educ. Res. Pract., 2014, 15, 594–609.
2. L. T. Tien, V. Roth and J. Kampmeier, J. Res. Sci. Teach., 2002, 39, 606–632.
3. S. Freeman, S. L. Eddy, M. McDonough, M. K. Smith, N. Okoroafor, H. Jordt and M. P. Wenderoth, Proc. Natl. Acad. Sci. U. S. A., 2014, 111, 8410–8415.


4. E. J. Theobald, M. J. Hill, E. Tran, S. Agrawal, E. N. Arroyo, S. Behling, N. Chambwe, D. L. Cintrón, J. D. Cooper and G. Dunster, Proc. Natl. Acad. Sci. U. S. A., 2020, 117, 6476–6483.
5. J. M. Fautch, Chem. Educ. Res. Pract., 2015, 16, 179–186.
6. A. B. Flynn, Chem. Educ. Res. Pract., 2015, 16, 198–211.
7. S. R. Mooring, C. E. Mitchell and N. L. Burrows, J. Chem. Educ., 2016, 93, 1972–1983.
8. Y. Liu, J. R. Raker and J. E. Lewis, Chem. Educ. Res. Pract., 2018, 19, 251–264.
9. M. A. Rau, K. Kennedy, L. Oxtoby, M. Bollom and J. W. Moore, J. Chem. Educ., 2017, 94, 1406–1414.
10. J. C. Shattuck, J. Chem. Educ., 2016, 93, 1984–1992.
11. J. C. Shattuck, in Active Learning in Organic Chemistry: Implementation and Analysis, ACS Publications, 2019, pp. 167–186.
12. M. K. Seery, J. Chem. Educ., 2015, 92, 1566–1567.
13. M. A. Christiansen, J. Chem. Educ., 2014, 91, 1845–1850.
14. M. D. Casselman, K. Atit, G. Henbest, C. Guregyan, K. Mortezaei and J. F. Eichler, J. Chem. Educ., 2019, 97, 27–35.
15. C. Cormier and B. Voisard, Front. ICT, 2018, 4, 30.
16. R. D. Rossi, J. Chem. Educ., 2015, 92, 1577–1579.
17. M. P. DeMatteo, in Active Learning in Organic Chemistry: Implementation and Analysis, ACS Publications, 2019, pp. 217–240.
18. L. A. Morsch, in The Flipped Classroom Volume 1: Background and Challenges, ACS Publications, 2016, pp. 73–92.
19. J. Mutanyatta-Comar and S. R. Mooring, in From General to Organic Chemistry: Courses and Curricula to Enhance Student Retention, ACS Publications, 2019, pp. 145–157.
20. J. L. Muzyka, J. Chem. Educ., 2015, 92, 1580–1581.
21. M. A. Christiansen, A. M. Lambert, L. S. Nadelson, K. M. Dupree and T. A. Kingsford, J. Chem. Educ., 2017, 94, 157–163.
22. B. M. McCollum, C. L. Fleming, K. M. Plotnikoff and D. N. Skagen, Can. J. Scholarship Teach. Learn., 2017, 8, n3.
23. B. G. Trogden, J. Chem. Educ., 2015, 92, 1570–1571.
24. C. O. Welder, in Active Learning in Organic Chemistry: Implementation and Analysis, ACS Publications, 2019, pp. 119–148.
25. K. S. Rein and D. T. Brookes, J. Chem. Educ., 2015, 92, 797–802.
26. A. Bandura, Am. J. Health Promot., 1997, 12, 8–10.
27. N. Mercer, in Language and the Joint Creation of Knowledge, Routledge, 2019, pp. 156–186.
28. A. R. Cavagnetto, Rev. Educ. Res., 2010, 80, 336–371.
29. M. C. Wittrock, Educ. Psychol., 1989, 24, 345–376.
30. Dialogue Patterns in Peer Collaboration that Promote Learning, ed. M. T. Chi and M. Menekse, American Educational Research Association, Washington, DC, 2015.
31. M. T. Chi, S. Kang and D. L. Yaghmourian, J. Learn. Sci., 2017, 26, 10–50.


32. M. T. Chi and M. Menekse, Socializing Intelligence through Academic Talk and Dialogue, 2015, pp. 263–274.
33. M. T. Chi and R. Wylie, Educ. Psychol., 2014, 49, 219–243.
34. S. Erduran, in Argumentation in Chemistry Education: Research, Policy and Practice, ed. S. Erduran, Royal Society of Chemistry, London, 2019, pp. 228–246.
35. A. Pabuccu and S. Erduran, Int. J. Sci. Educ., 2017, 39, 1154–1172.
36. D. C.-R. de Arellano and M. H. Towns, Chem. Educ. Res. Pract., 2014, 15, 501–515.
37. J. M. Deng and A. B. Flynn, Chem. Educ. Res. Pract., 2021, 22, 749–771.
38. C. Chin and J. Osborne, J. Res. Sci. Teach., 2010, 47, 883–908.
39. C. Chin and J. Osborne, J. Learn. Sci., 2010, 19, 230–284.
40. R. J. Marzano and D. J. Pickering, Building Academic Vocabulary: Teacher's Manual, ERIC, 2005.
41. M. Kaddoura, Educ. Res. Q., 2013, 36, 3–24.
42. Explain Everything, interactive whiteboard mobile app, available from: itunes.apple.com, accessed August 2019.
43. F. M. Watts and S. A. Finkenstaedt-Quinn, Chem. Educ. Res. Pract., 2021, 22, 565–578.
44. D. Kuhn, Educ. Res. Rev., 2009, 4, 1–6.
45. J. Osborne, S. Erduran and S. Simon, J. Res. Sci. Teach., 2004, 41, 994–1020.
46. E. M. Nussbaum, C. M. Kardash and S. E. Graham, J. Educ. Psychol., 2005, 97, 157.
47. C. S. Asterhan and B. B. Schwarz, Cogn. Sci., 2009, 33, 374–400.
48. POGIL, An Introduction to Process Oriented Guided Inquiry Learning for Those Who Wish to Empower Learners, Stylus Publishing, Sterling, VA, 2019.
49. A. Moon, C. Stanford, R. Cole and M. Towns, J. Chem. Educ., 2017, 94, 829–836.

Chapter 11

Systemic Assessment Questions as a Means of Assessment in Organic Chemistry

Gulten Sendur*

Dokuz Eylul University, Buca Faculty of Education, Department of Mathematics and Science Education, Izmir, Turkey
*E-mail: [email protected]

Advances in Chemistry Education Series No. 10: Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices. Edited by Nicole Graulich and Ginger Shultz. © The Royal Society of Chemistry 2023. Published by the Royal Society of Chemistry, www.rsc.org

11.1 Introduction

As a sub-branch of chemistry, organic chemistry touches everyone's life profoundly. The basic concepts and principles of organic chemistry are essential to many areas of research. The content of an organic chemistry course is composed of specific concepts and principles that learners may be encountering for the first time. Learners of organic chemistry must have a deep enough understanding to meaningfully relate to, and interpret, the concepts and principles that are mostly encountered in organic chemistry, such as resonance, hyperconjugation, nucleophile–electrophile, and the steric effect, along with topics learned at the general chemistry level, such as acid–base chemistry, thermodynamic stability, polarity, hydrogen bonding and electronegativity. Students face challenges in organic chemistry that involve not only understanding concepts and principles, but also interpreting how structural changes in molecules can affect reaction types, mechanisms and the products formed. Mechanistic reasoning is also a vital part of the learning process as


a means of understanding the nature of reactions in organic chemistry. It is through these processes that learners can meaningfully and sustainably internalize reactions in organic chemistry without memorizing.1,2 In guiding students towards a meaningful understanding of organic chemistry, the factors affecting electron density, such as inductive and mesomeric effects, as well as concepts specific to covalent bonds such as radicals, carbocations, and the symbolic representation of electron flow, must be explained. As Graulich (2015)1 emphasizes with her iceberg analogy, to understand the molecular representations and reactions in the visible part of the iceberg, the many concepts and theories remaining under the iceberg need to be made meaningful. Another challenge for learners in organic chemistry is its unique symbolic language. Mastery of this symbolic language, full of the curved arrows of the electron-pushing formalism (EPF), helps students to understand the nature of reactions and make connections between them in a thorough analysis.3,4 These aspects, which we can summarize as the nature of organic chemistry, can be challenging for students at different stages of their education, from high school to university. Students may be helped in learning about the nature of organic chemistry by applying appropriate strategies in the learning process and by being encouraged not to perceive organic chemistry as a difficult course. It is vitally important for this reason that the classroom environment includes measurement and assessment activities that are appropriate to the nature of organic chemistry and that the learning setting is arranged to support such activities. This chapter will discuss how systemic questions and diagrams can be used to make assessments of learning that are compatible with the nature of organic chemistry.
The systemic approach to teaching and learning (SATL), which provides a theoretical framework for systemic questions and diagrams, will first be explained, after which various examples of systemic assessment questions (SAQs) and diagrams in organic chemistry will be given. The types of assessments that can be made with such examples will be reviewed against the background of the literature. The chapter will also provide an example of an SAQ diagram that can be used both in exploring SATL studies and in the organic chemistry classroom.

11.2 The Role of Scientific Reasoning Skills in Developing Meaningful Understanding in Organic Chemistry

One of the main goals in achieving meaningful learning, as put forward by Ausubel (1968),5 is reaching meaningful understanding. Meaningful understanding requires that chemistry students can use all aspects of their knowledge. Meaningful understanding allows students to make judgments within the scope of their knowledge of chemistry, to form associations and cause–effect relationships, draw inferences, and make estimations regarding chemical processes.6


In terms of organic chemistry, it is important for students to make connections between concepts and deal with these connections in an integrated structure to reach a meaningful level of understanding. Reasoning skills play a role in this process as a determinant of the quality of conceptual understanding7,8 and of the knowledge acquisition process.9 Reasoning in organic chemistry involves various simultaneous cognitive processes, such as balancing the factors in a reaction step, associating concepts with structural representations, and following electron flow with curved arrow notation.1 Students must develop various scientific reasoning skills, such as mechanistic reasoning, spatial reasoning and causal reasoning, to reach the level of meaningful understanding needed in organic chemistry. By assessing students' meaningful understanding in organic chemistry, teachers can assist students in overcoming shortcomings in scientific reasoning skills and in avoiding the adoption of alternative concepts.10,11 Utilizing appropriate and applicable tools is useful in assessing students' levels of meaningful understanding.

11.3 Assessment of Students' Meaningful Understanding in the Context of SATL

A variety of different tools—concept maps, interviews, open-ended questions, two-/three-tier conceptual questions—can be used to assess students' meaningful understanding.12–14 Although such tools can help to determine students' level of understanding, they can be insufficient for making an in-depth assessment. For example, an open-ended question might ask students to complete the reaction of an organic molecule, but it is very difficult with this type of question for students to easily make an integrated association between the organic molecules in a chemical reaction. Again, with two- or three-tier conceptual questions, it is not easy to understand how students formulate associations between reactions, since these questions focus more on concepts. Hodges and Harvey (2003)15 drew attention to the importance of using different tools to assess the learning quality of students, especially in organic chemistry courses. Assessment tools are needed that help students to make associations between different classes of molecules, especially in reactions in organic chemistry, and to integrate these associations to formulate new reactions. A tool that can be used for this purpose is the systemic assessment question and diagram, which was developed as a fundamental tool of SATL. Developed by Fahmy & Lagowski (2003)16 based on Ausubel's theory of meaningful learning and the concept maps developed by Novak and Gowin (1984),17 SATL treats concepts and topics two-dimensionally in a spatial arrangement where the presentation of topics and their relationships has a central role. The main objective here is to create an interactive system in which learners are clear about all the relationships between fundamental concepts and topics.18 The arrangement of topics in SATL is based on a


Figure 11.1 The presentation of concepts in the systemic diagram. Reproduced from ref. 16 with permission from American Chemical Society, Copyright 2003.

closed system of concepts in circular form (i.e., a cyclic concept map) called a systemic diagram (Figure 11.1). Although the structure of systemic diagrams resembles an ontological model representing knowledge with concept maps, there are structural differences between the two.19 The difference is that systemic diagrams take the form of a closed system of concepts, whereas hierarchical concept maps mostly include more limited relationships. All relationships between concepts can thus be shown in systemic diagrams.20 When we compare concept maps with systemic diagrams, perhaps the most striking difference is that with diagrams we are able to show an integrated and multi-faceted relationship between molecules through reactions. For example, while we can show the different addition reactions of alkenes in one part of a systemic diagram related to alkenes, in another part we can show the synthesis of alkenes from alkyl halides. Furthermore, displaying in another part of the diagram how alkyl halides are formed by the halogenation of alkanes can teach learners to adopt an integrative perspective on forming associations between these molecules. In short, when we focus on the reactions between organic molecules in systemic diagrams, the diagrams provide much more detailed information compared to concept maps.21 Presenting concepts in closed-cluster form in systemic diagrams not only enables learners to find detailed information, but also helps them to develop important thinking skills such as decision-making, multidimensional thinking, and forming relationships so that they can organize a conceptual structure.6 When learners develop such high-level thinking skills, this plays a facilitating role in reaching meaningful understanding.
It can be said, then, that learners can reach meaningful understanding with SATL.22 Fahmy (2014)23 has summarized why SATL can be used in learning and teaching processes as follows:

● Helps students to understand the relationships between concepts in an integrated manner.


● Engages students in deep learning.
● Helps students acquire high-level cognitive skills such as problem-solving and decision-making.
● Offers the fundamental skills of systemic thinking that will enable individuals to analyze the basic components of a system and synthesize these into a meaningful whole.

With these features, SATL is a model that not only provides meaningful understanding, but also helps to assess meaningful understanding. SATL demonstrates that students who have a meaningful understanding in science can relate both concepts and the relationships between them in a holistic structure.6,22,24

11.3.1 Systemic Diagrams and Systemic Assessment Questions

Systemic diagrams, the basic teaching tool of the SATL model, have a two-dimensional spatial arrangement in which concepts and the relationships between them are represented.22 Systemic assessment questions (SAQs) constitute another tool that displays an arrangement like systemic diagrams but encompasses fewer concepts and connections. SAQs can be considered a sub-system of systemic diagrams and can be prepared in various geometric forms—triangular, quadrilateral, pentagonal, and hexagonal—depending upon the number of concepts and inter-conceptual relationships they contain.25 The most fundamental difference between systemic diagrams and SAQs is the number of concepts these structures encompass. Systemic diagrams are more comprehensive in terms of content, since they have been developed to cover all related concepts in the relevant chemistry topic; SAQs, however, are more limited. Another important difference is the purpose for which the two tools were developed. When students encounter concepts presented in systemic diagrams for the first time, the diagrams act as an instructional tool in the processing of educational content. An important feature of systemic diagrams is that some inter-conceptual relationships are not defined at first but are completed as the teaching process continues. For example, in Figure 11.2, the reactions and synthesis of alkanes are covered in the systemic diagram related to aliphatic hydrocarbons developed by Fahmy and Lagowski (2002),24 in which the related reactions are completed in the diagram; the chemical reactions involving alkenes and alkynes, however, have not yet been described.
The formation of ethyl bromide by the addition of HBr to ethene, or the formation of ethene by the elimination reaction of ethyl bromide with a strong base in the presence of alcohol, as indicated by arrow number 2 in the diagram, will be completed with the teaching of the subject of alkenes. The same is true for reactions numbered 1, 3 and 4, which are related to the synthesis and reactions of alkenes. Similarly, the aim is to complete reactions numbered 5 and 6 related


Figure 11.2 Systemic diagram related to aliphatic hydrocarbons. Reproduced from ref. 24 with permission from IUPAC, Copyright 2002.

to the reaction and synthesis of acetylene in the diagram when tackling the subject of alkynes. Another important feature of systemic diagrams is that they can include all reactions between molecules. Although the diagram developed by Fahmy and Lagowski (2002)24 was drawn for aliphatic hydrocarbons, the reactions of other organic molecules, such as alcohols, carboxylic acids, alkyl halides and aldehydes, that are associated with the synthesis and reactions of aliphatic hydrocarbons are also shown (reactions 7, 8, 9 and 10). In this respect, it is possible to form a meaningful structure by establishing multifaceted relationships between systemic diagrams and concepts in organic chemistry. SAQs, in contrast, were developed to assess students' understanding and are usually applied after students become familiar with the educational content.21 It is very rare for students to encounter concepts in SAQs that they have not yet learned. It is therefore appropriate to use SAQs for the purpose of formative assessment, although there are cases where SAQs are used as an instructional tool.21,26 In the context of SATL, Fahmy and Lagowski (2014)27 proposed different types of SAQs: Systemic Multiple-Choice Questions (SMCQs), Systemic True/False Questions (STFQs), Systemic Matching Questions (SMQs), Systemic Sequencing Questions (SSQs), Systemic Synthesis Questions (SSynQs), and Systemic Analysis Questions (SAnQs). A question that can serve as an example of Matching on Pentagonal Systemics, which is a


type of SMQ, and an example of a Systemic Synthesis Question (SSynQ) are shown in Figure 11.3. In the question prepared in the SMQ format shown in Figure 11.3, students are expected to complete the diagram by considering both the reactions of the related molecule and the products formed in these reactions. The question type in the SSynQ style in Figure 11.3 is of a higher cognitive order than SMQs. In this question type, involving the structural properties of molecules, students must construct the intermolecular reactions as an integrated whole. The student is asked to name the molecules and use structural formulas to produce the diagram while no information (reagents, reaction conditions) is provided, making this kind of question more challenging than other types. These original versions of SAQs developed by Fahmy and Lagowski have been revised by some researchers, who turned them into structured and

Figure 11.3 Examples of SMQs and SSynQs. Reproduced from ref. 27 with permission from the Federation of African Societies of Chemistry, Copyright 2014.


Figure 11.4 A three-stage process to successfully complete SAQ diagrams. Adapted from ref. 6 with permission from Springer Nature, Copyright 2014.

semi-complete systemic diagrams (fill-in-the-blanks SAQs).11,28 Vachliotis et al. (2014)6 stated that in a properly designed SAQ diagram, a meaningful structure can be created from interrelated conceptual subsystems, and these structures encourage students to use higher-order thinking skills (HOTS) such as analysis and synthesis. As learners complete the diagram, their analysis of the basic components of molecules, the reagents/reaction conditions and the direction of the reactions in the diagram, followed by their achieving an integrated and meaningful synthesis of the relationships between these components, will make an important contribution to the development of their HOTS. SAQ diagrams thus provide teachers with a detailed perspective on students' understanding, knowledge integration and reasoning skills where traditional assessment methods are limited. To successfully complete SAQ diagrams, students may follow the three-stage process summarized in Figure 11.4, in which HOTS such as analysis and synthesis are used.6 Working through the analysis of an SAQ diagram, which comprises many subsystems, may help learners to make more sense of the diagram. It can therefore be said that SAQs come to the forefront as a tool that can be used to assess students' levels of meaningful understanding and their high-level thinking skills, such as systems thinking, especially in the context of organic chemistry courses.21,22,29

11.3.2 Assessment of SAQs

Different methods can be pursued in the assessment of SAQs, depending on the purpose of the assessment. For instance, if the main purpose is to evaluate whether the content in the SAQ diagram is correct, scoring can be applied on the principle that each element in the diagram contributes equally to the creation of the conceptual whole.22 It is then possible to calculate students' total scores on the diagram by assigning 1 point to each correct


element (structural formula of the molecule, arrow direction, reagents and reaction conditions, arrow direction for the newly drawn reaction) that needs to be completed on the diagram. If the intention is to evaluate students' conceptual structures with SAQs, one way to do this is to conduct an analysis of two variables: the size of the conceptual structure and the strength of the conceptual structure.30 The size of the conceptual structure is the ratio of the number of concepts identified by the student to the total number of concepts in the diagram, while the strength of the conceptual structure is the ratio of the number of valid relationships identified by the student on the diagram to the total number of relationships. Hrin et al. (2018)30 classified the size of students' conceptual constructs as "insignificant, small, partial, substantial and complete" and similarly grouped the strength of conceptual constructs as "weak, medium and strong." For example, when the size of a student's conceptual construct is between 20 and 40%, this is regarded as "small," whereas sizes between 40 and 60% are classified as "partial." Similarly, when the strength of a conceptual construct is between 33 and 66%, it is categorized as medium, and as strong when it is between 66 and 100%. The researchers took the combination of the two variables as the basis for developing a scoring rubric and, in this context, classified cognitive constructs as "nominal, functional, structural multidimensional and distinguished multidimensional." This type of classification not only permits an assessment of students' conceptual constructs in the light of different criteria, but also offers information on the quality of the cognitive constructs. The use of SAQs also makes it possible to assess students in terms of systems thinking levels. To this end, Vachliotis et al.
(2014)6 developed a rubric based on the distinction, system, relationship, and perspective (DSRP)31 model, whereby systems thinking can be analyzed in terms of its fundamental components and its subsystems, forming a synthesis in a whole. This rubric evaluates systems thinking skills in five stages, with points assigned from 0 to 5. (The rubric is included in the online supplement.) Assessing SAQs in the context of systems thinking based on the DSRP model will provide useful information to researchers and teachers about how students generate relationships between notions and whether they can handle them in a holistic structure.
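The scoring approaches described above, per-element scoring and the size/strength analysis of Hrin et al.,30 amount to simple ratio arithmetic. The sketch below (ours, not from the chapter) illustrates the calculation; the exact band edges for "substantial" and "complete," and the handling of boundary values, are assumptions, since the text quotes only some of the thresholds.

```python
# Illustrative sketch: scoring a completed SAQ diagram by the two variables
# described by Hrin et al. (2018) -- the size and the strength of a student's
# conceptual structure. The 20/40/60% and 33/66% cut points follow the text;
# the remaining band edges are assumed to continue in even steps.

def conceptual_size(concepts_identified: int, total_concepts: int) -> float:
    """Size = ratio of concepts the student identified to all concepts in the diagram."""
    return concepts_identified / total_concepts

def conceptual_strength(valid_relationships: int, total_relationships: int) -> float:
    """Strength = ratio of valid relationships drawn to all relationships in the diagram."""
    return valid_relationships / total_relationships

def classify_size(size: float) -> str:
    # Categories quoted in the text: insignificant, small, partial, substantial, complete.
    if size < 0.20:
        return "insignificant"
    if size < 0.40:
        return "small"
    if size < 0.60:
        return "partial"
    if size < 0.80:       # assumed band edge
        return "substantial"
    return "complete"

def classify_strength(strength: float) -> str:
    # Categories quoted in the text: weak, medium, strong.
    if strength < 0.33:
        return "weak"
    if strength < 0.66:
        return "medium"
    return "strong"

# Example: a student identifies 9 of 16 concepts and draws 7 valid of 12 relationships.
size = conceptual_size(9, 16)          # 0.5625 -> "partial"
strength = conceptual_strength(7, 12)  # ~0.583 -> "medium"
```

The pair of labels ("partial", "medium") would then be mapped onto the combined rubric categories (nominal, functional, structural multidimensional, distinguished multidimensional) described in the text.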

11.4  Research on Systemic Diagrams in Organic Chemistry Education

A look at the studies on systemic diagrams shows that they have been explored in various areas of chemistry—physical chemistry,32,33 biochemistry,34,35 environmental chemistry,36 and inorganic chemistry.37 Studies in organic chemistry, like those carried out in other areas, can be divided into two groups.


The aim of the first group of research is to explore students' meaningful learning/understanding, systems thinking skills, and cognitive structures by using SAQs as an assessment tool. One of these studies was conducted by Vachliotis et al. (2011),22 with the aim of revealing how effective the SAQs developed in the study were in assessing eleventh-grade students' meaningful learning about organic reactions. In this study with 72 eleventh-grade students, the researchers proceeded in two stages. In the first stage, they used a traditional educational approach for the topic of hydrocarbons and afterwards administered a test containing SAQs as well as conventional objective questions. In the second stage, the topic of alcohols and carboxylic acids was taught using diagrams developed according to the techniques of SATL, after which, as in the first stage, a test containing both SAQs and conventional objective questions was administered. The results of the exploratory factor analysis for both stages of the study showed that SAQs played an important role in assessing meaningful learning. In particular, SAQs were found to be more suitable for assessing meaningful learning with more complex diagrams offering less guidance. In another study, Vachliotis et al. (2014)6 aimed to investigate whether SAQs on the classification of organic compounds, IUPAC nomenclature, structural isomers of organic compounds, and aliphatic hydrocarbons (alkanes, alkenes, and alkynes) constitute a valid and reliable tool for assessing students' meaningful learning and systems thinking skills. Accordingly, two different tests made up of SAQs and objective items were administered to 91 eleventh-grade students after the instruction. The results of the factor analysis revealed that appropriately designed objective questions, like SAQs, could be used to effectively determine the level of students' meaningful understanding.
The study also showed that SAQs could reveal the systems thinking skills on which the DSRP model is based, and that these questions were highly correlated with the objective items used to assess meaningful understanding. The researchers called attention to the fact that systems thinking levels are significantly related to meaningful understanding of the concept relationships in the same science domain. In the study conducted by Hrin et al. (2018),30 student-generated SSynQs were used to evaluate and compare the cognitive and conceptual structures of high school students (N = 71) and pre-service chemistry teachers (N = 12) in organic chemistry. The researchers prepared four SSynQs related to aliphatic hydrocarbons and their halogen derivatives, alcohols, ethers, and aldehydes and asked the students to recreate the reactions between the molecules in a diagram, after which they made a quantitative and qualitative analysis of the diagrams. At the end of the study, it was found that the high school students and pre-service chemistry teachers had good knowledge of the IUPAC nomenclature of organic compounds and their chemical structures (excluding ethers), but many participants had learning difficulties related to bond breaking/formation and reaction mechanisms. In particular, both student groups took the heterolytic cleavage of the polar H–Cl bond as the first step in the chlorination of alkanes instead of the homolytic cleavage
of the nonpolar Cl–Cl bond. The students' written reactions also suggested that alcohols could be directly converted to alkanes, even though the OH group is not a good leaving group. In the second group of studies on SAQs, diagrams were used as a teaching tool, and it was examined whether this SATL-based instruction influenced students' achievement and HOTS. Some of these studies were conducted by Fahmy and Lagowski with different student groups. In their study with high school students, the topic of carboxylic acids was treated with systemic diagrams in SATL activities in the experimental group, while instruction in the control group about relationships between classes of organic molecules was given piece by piece in a linear method that permitted neither an integrative approach nor the analysis of all the reactions taking place between classes of organic molecules.16 When the pre- and posttest achievement of the students on questions prepared according to both the linear and the SATL approach was compared, the mean scores of the experimental group students were found to be much higher than those of the students in the control group. Another study by Fahmy and Lagowski (2002)24 was conducted with second-year undergraduate students taking the organic chemistry course. Similar to the other research, the conclusion was reached, for the topic of aliphatic hydrocarbons, that the students in the experimental group who had worked with systemic diagrams had much higher total mean scores on both the linear and the systemic question types, and higher total mean scores on the test overall, compared to the control group who had been instructed with a linear approach. Another study in this second group was conducted by Hrin et al.
(2016).21 In this study on the topic of hydrocarbons and halogen derivatives, the posttest scores of third-year high school students who had been instructed with SSynQs were compared with those of a control group who had received traditional instruction, and the performance of the experimental group was found to be much better. In the same study, the test applied to the students to determine their performance after the instruction was also examined by factor analysis. The results showed that the SSynQs, which offered less guidance and differed from the examples in textbooks, were ideal for assessing students' meaningful understanding. Answering these types of questions requires students to analyze the problem, form associations, apply critical thinking to arrive at correct solutions, gather knowledge, and respond meaningfully; these and similar cognitive processes and skills can be said to be the factors that make the tool effective for assessing the extent of students' meaningful understanding. In another experimental study similar to their previous one, Hrin et al. (2016)26 found that the experimental group of high school students who had been instructed on the subject of hydrocarbons and halogen derivatives using SSynQs had a higher mean score than the control group who had been instructed with traditional methods. The study also showed, in the comparison of
the mental effort of both groups on the linear questions and the SSynQs, that the experimental group spent less mental effort on all types of questions. This indicated that the students in the experimental group solved the questions more easily and with less of a cognitive burden than the students in the control group. That is, the experimental group registered higher achievement with less mental effort. The fact that the combination of mental effort and academic achievement, regarded as the productivity of the method of instruction,38 was high in the experimental group is quite significant in terms of revealing the effectiveness of SSynQs-based instruction. In studies in which SAQs are used as an instructional tool, not only can the effect on students' meaningful learning levels be examined, but also whether there is any effect on systems thinking skills, which can be evaluated within the scope of HOTS. In this context, in their studies conducted with high school students, Hrin et al. (2016, 2017)29,39 showed that the level of systems thinking of students who receive instruction based on SAQs is higher than that of students who are taught with a traditional approach. The difference has been found to be much more pronounced at more complex levels of systems thinking. It is clear that SAQs-based learning not only helps students to define specific reactions unique to a certain class of molecules, but also brings them to a level at which they can identify reactions between different classes of molecules in a complex system. In summary, whether SAQs or SAQ diagrams are used as an assessment or teaching tool, it can be said that they are effective in revealing and developing different skills and achievements, from meaningful learning to cognitive structure and systems thinking skills.
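The "productivity of the method of instruction" that combines mental effort with academic achievement38 is commonly computed as a relative instructional-efficiency score from group-standardized values, E = (z_performance − z_effort)/√2, following Paas and Van Merriënboer. The sketch below assumes that formulation; the sample data are invented purely for illustration.

```python
# Relative instructional efficiency in the sense of Paas & Van Merrienboer (1993):
# E = (z_performance - z_effort) / sqrt(2), computed from group-standardized
# scores. A positive E indicates relatively high achievement for relatively low
# reported mental effort; the group mean of E is zero by construction.
from math import sqrt
from statistics import mean, pstdev

def z_scores(values):
    """Standardize a list of raw scores (population standard deviation)."""
    m, s = mean(values), pstdev(values)
    return [(v - m) / s for v in values]

def efficiency(performance, effort):
    """Per-student instructional efficiency E = (zP - zE) / sqrt(2)."""
    zp, ze = z_scores(performance), z_scores(effort)
    return [(p - e) / sqrt(2) for p, e in zip(zp, ze)]

# Hypothetical data: the first students score higher with less reported effort.
performance = [85, 78, 92, 70, 88, 60, 55, 65]
effort      = [3,  4,  2,  5,  3,  7,  8,  6]   # e.g., a 9-point mental-effort scale
E = efficiency(performance, effort)
```

In a study comparing two instructional conditions, each group's mean E (standardized against the pooled sample) would summarize whether its achievement was obtained at a proportionately low or high cost in mental effort.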

11.5  Example of an Activity to Assess Students' Meaningful Understanding with SAQs Diagrams in Organic Chemistry Lessons

Unlike the studies above, Şendur (2020)11 developed the SAQs diagram presented in Figure 11.5 to reveal how pre-service chemistry teachers create meaningful relationships between the reactions of aromatic compounds and what kinds of learning difficulties they may have in this process. The diagram, which contains four sub-systems, is semi-complete and structured, meaning that it was prepared in a fill-in-the-blanks format. After the diagram had been completed, interviews were conducted with all of the pre-service teachers using the think-aloud technique. During the interviews, the pre-service teachers were given the diagrams they had completed and asked to explain out loud how they completed them and what kinds of relationships they formed between the reactions. The aim was to understand the pre-service teachers' thinking processes in identifying the reactions in the diagrams, how they considered the reagents and reaction conditions, and which points they found challenging.


Figure 11.5  Aromatic compound SAQs diagram. Reproduced from ref. 11 with permission from the Royal Society of Chemistry.

The analysis of the SAQs diagram was performed quantitatively by giving 1 point to each correct item that needed to be completed on the diagram (structural formula of the molecule, the direction of the arrow, the reagents and reaction conditions, the direction of the arrow for the newly drawn reaction), while in the qualitative analysis, a coding scheme was created to identify whether these items were completed correctly. According to the score distribution of the pre-service teachers' SAQ diagrams, more than half had average scores and the rest had low scores. This was an indication that the pre-service chemistry teachers had some difficulties in defining the reactions of aromatic compounds and establishing meaningful relationships between them. The qualitative analysis of both the diagrams and the interviews revealed that while the pre-service chemistry teachers were more successful in writing the product correctly in reactions where the reagents and catalysts were known, they had considerable difficulty in writing a reaction that was not specified in the diagram or in writing the reactants and conditions of a reaction. One of the important findings of the research is that pre-service teachers tend to write reaction products or reagents without thinking about the mechanism. This was especially evident in the writing of the methyl group as a reactant in the formation of methylbenzene from benzene in the Friedel–Crafts alkylation reaction, and of dimethyl ketone as a product of the reaction of benzene with acetyl chloride in the presence of
AlCl3 in the Friedel–Crafts acylation reaction. Similar difficulties were observed in writing the products and reagents of the nitration and sulfonation of benzene. For example, in the nitration of benzene, some pre-service teachers wrote down the NO2 group as a reagent, while in the sulfonation of benzene, others attached SO2 or SO4 groups to the aromatic ring. This showed that the pre-service teachers understood neither the role of the electrophile nor the mechanism of the reaction. Another striking finding was that in the side-chain oxidation of alkylbenzenes, some pre-service teachers did not take the oxidizing agent KMnO4 present in the reaction medium into consideration, instead focusing on the OH− ion and indicating an alcohol as the product of the reaction. This demonstrates that the pre-service teachers could not differentiate between reagents and solvents and that they frequently tended to base their interpretations on groups with which they were more familiar. These results are important in demonstrating that while evaluating the meaningful relationships established by students with SAQs diagrams, it is also possible to determine what kinds of learning difficulties and alternative conceptions students have.

11.6  Conclusions and Implications

In organic chemistry, knowledge limited to a specific family of organic molecules, without associating the structural and chemical properties of organic molecule families with each other, is usually inadequate for achieving meaningful learning. It is therefore important to establish how organic molecules are associated with each other. This association process includes not only comparing two or more organic molecules in terms of the functional groups they contain, but also interpreting their impact on reactivity, determining the type of interaction between the species, and estimating what the product of a reaction will be. Thus, scientific reasoning skills such as mechanistic and causal reasoning are important components of this process in organic chemistry. Since understanding reaction mechanisms is much easier when causal relationships are correctly formed, creating systems in which cause-and-effect relationships can be treated from all angles will help students reach meaningful understanding. It can therefore be said that a meaningful level of understanding can be achieved in organic chemistry by creating systems in which relationships between basic structures are established in sub-systems under multiple perspectives and then handled in an integrated structure. In particular, the more such conceptual sub-systems are organized and related to each other, the deeper the understanding of the pertinent chemical concepts will be.25 Such systems are useful not only for achieving meaningful understanding, but also for assessing students' level of meaningful understanding and revealing at what points they face learning difficulties. SAQ diagrams based on SATL are tools that can be used to this end. SAQ diagrams provide a means
of creating a meaningful structure by forming multifaceted associations between concepts using HOTS such as analysis and synthesis, thus offering the opportunity to assess students' meaningful understanding. This section has primarily focused on how to use SAQs and SAQ diagrams in assessing students' levels of meaningful understanding in organic chemistry. Especially in the context of organic chemistry lessons, we believe that the use of such diagrams will contribute to creating learning environments that direct students toward meaningful learning rather than rote learning. We suggest that the use of such diagrams will help students to engage in causal reasoning about reactions. When working with the diagrams, it might also be useful to ask students to explain the mechanisms behind the reactions to encourage them to improve their mechanistic reasoning skills. In this context, it might be especially supportive, in terms of syllabus planning, to review the explanations of students who cannot make sufficient associations between molecules on the diagram or who write down the wrong reagent or product, and to examine those explanations in the context of the reaction mechanisms and how the students use symbolic language.

References

1. N. Graulich, Chem. Educ. Res. Pract., 2015, 16(1), 9.
2. F. M. Watts, J. A. Schmidt-McCormack, C. A. Wilhelm, A. Karlin, A. Sattar, B. C. Thompson, A. R. Gere and G. V. Shultz, Chem. Educ. Res. Pract., 2020, 21(4), 1148.
3. A. B. Flynn and R. B. Featherstone, Chem. Educ. Res. Pract., 2017, 18(1), 64.
4. K. R. Galloway, C. Stoyanovich and A. B. Flynn, Chem. Educ. Res. Pract., 2017, 18(2), 353.
5. D. P. Ausubel, Educational Psychology: A Cognitive View, Holt, Rinehart and Winston, New York, 1968.
6. T. Vachliotis, K. Salta and C. Tzougraki, Res. Sci. Educ., 2014, 44(2), 239.
7. A. E. Lawson, J. Res. Sci. Teach., 2005, 42(6), 716.
8. T. D. Sadler and D. L. Zeidler, Sci. Educ., 2005, 89(1), 71.
9. D. Kuhn, in Blackwell Handbook of Childhood Cognitive Development, ed. U. Goswami, Blackwell, Malden, MA, 2004, p. 371.
10. A. Kraft, A. M. Strickland and G. Bhattacharyya, Chem. Educ. Res. Pract., 2010, 11(4), 281.
11. G. Şendur, Chem. Educ. Res. Pract., 2020, 21(1), 113.
12. J. D. Novak, J. J. Mintzes and J. H. Wandersee, in Assessing Science Understanding: A Human Constructivist View, ed. J. J. Mintzes, J. H. Wandersee and J. D. Novak, Academic Press, San Diego, 2000, p. 355.
13. M. Nieswandt and K. Bellomo, J. Res. Sci. Teach., 2009, 46(3), 333.
14. M. B. Nakhleh and R. C. Mitchell, J. Chem. Educ., 1993, 70(3), 190.
15. L. C. Hodges and L. C. Harvey, J. Chem. Educ., 2003, 80(7), 785.


16. A. F. M. Fahmy and J. J. Lagowski, J. Chem. Educ., 2003, 80(9), 1078.
17. J. D. Novak and D. B. Gowin, Learning How to Learn, Cambridge University Press, Cambridge, 1984.
18. T. Vachliotis, K. Salta and C. Tzougraki, Think. Skills Creat., 2021, 41, 100881.
19. T. N. Hrin, M. D. Segedinac and M. D. Milenković, Afr. J. Chem. Educ., 2013, 3(2), 76.
20. A. F. M. Fahmy and J. J. Lagowski, Afr. J. Chem. Educ., 2011, 1(1), 29.
21. T. N. Hrin, D. D. Milenković and M. D. Segedinac, Int. J. Sci. Math. Educ., 2016, 14(5), 805.
22. T. Vachliotis, K. Salta, P. Vasiliou and C. Tzougraki, J. Chem. Educ., 2011, 88(3), 337.
23. A. F. M. Fahmy, Afr. J. Chem. Educ., 2014, 4(2), 2.
24. A. F. M. Fahmy and J. J. Lagowski, Chem. Educ. Int., 2002, 3(1), http://www.list.iupac.org/publications/cei/vol3/0301x0an1.html, accessed 28 September 2021.
25. A. F. M. Fahmy and J. J. Lagowski, Afr. J. Chem. Educ., 2012, 2(2), 66.
26. T. N. Hrin, A. F. M. Fahmy, M. D. Segedinac and D. D. Milenković, Res. Sci. Educ., 2016, 46(4), 525.
27. A. F. M. Fahmy and J. J. Lagowski, Afr. J. Chem. Educ., 2014, 4(4), 35.
28. C. Tzougraki, K. Salta and T. Vachliotis, Afr. J. Chem. Educ., 2014, 4(2), 101.
29. T. N. Hrin, D. D. Milenković, M. D. Segedinac and S. Horvat, J. Serb. Chem. Soc., 2016, 81(12), 1455.
30. T. N. Hrin, D. D. Milenković and M. D. Segedinac, Chem. Educ. Res. Pract., 2018, 19(1), 305.
31. D. Cabrera and L. Colosi, Eval. Program Plann., 2008, 31(3), 311.
32. M. Nazir and I. I. Naqvi, Afr. J. Chem. Educ., 2011, 1(2), 59.
33. M. Nazir, I. I. Naqvi and R. Khattak, Afr. J. Chem. Educ., 2013, 3(1), 79.
34. S. B. Golemi, R. Keçira, N. Medja and D. Lacej, Afr. J. Chem. Educ., 2013, 3(1), 74.
35. S. B. Golemi, Afr. J. Chem. Educ., 2017, 7(3), 98.
36. A. Firdous, M. Nazir and I. I. Naqvi, Afr. J. Chem. Educ., 2015, 5(2), 59.
37. A. F. M. Fahmy and J. J. Lagowski, Afr. J. Chem. Educ., 2015, 5(1), 44.
38. F. G. W. C. Paas and J. J. G. Van Merriënboer, Hum. Factors, 1993, 35(4), 737.
39. T. N. Hrin, D. D. Milenković, M. D. Segedinac and S. Horvat, Think. Skills Creat., 2017, 23, 175.

Chapter 12

Variations in the Teaching of Resonance—An Exploration of Organic Chemistry Instructors' Enacted Pedagogical Content Knowledge†

Emily L. Atieh,a Jherian K. Mitchell-Jones,a Dihua Xue,b and Marilyne Stains*a

a Department of Chemistry, University of Virginia, 409 McCormick Road, Charlottesville, VA 22904-4319, USA; b Center for Learning Innovation, University of Minnesota Rochester, 111 South Broadway, Suite 300, Rochester, MN 55904, USA
*E-mail: [email protected]

12.1  Introduction

Within the organic chemistry curriculum, resonance is a fundamental concept used to explain a broad array of phenomena, including structural properties, reactivity, and spectroscopy. In fact, resonance is described or implicated to some extent in 6 of the 10 undergraduate organic chemistry anchoring concepts developed by the American Chemical Society.1 Previous

† Electronic supplementary information (ESI) available. See DOI: 10.1039/9781839167782

  Advances in Chemistry Education Series No. 10 Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices Edited by Nicole Graulich and Ginger Shultz © The Royal Society of Chemistry 2023 Published by the Royal Society of Chemistry, www.rsc.org


studies looking at faculty perceptions of salient concepts in organic chemistry have also highlighted the importance of resonance.2,3 For example, Bhattacharyya showed that faculty identified resonance as a valuable prerequisite skill necessary for the electron-pushing formalism in the context of mechanistic reasoning.2 The faculty rated the ability to draw resonance hybrids among the most important skills for students to develop. Further underscoring its significance, Betancourt-Perez and Oliviera identified a relationship between students' understanding of resonance and their overall grades in organic chemistry.4 Still, the concept of resonance continues to pose a conceptual challenge for students. Although resonance is covered in the general chemistry curriculum, studies suggest that students often have difficulty transferring their knowledge into organic chemistry.5 Multiple studies have identified alternate conceptions that students hold related to resonance, including confusion about what it means for molecules to undergo resonance.3,4,6,7 Students may struggle with visualization of the resonance hybrid (i.e., the structure that actually exists in nature), often holding the incorrect idea that resonance structures exist as their own entities that "oscillate" or "equilibrate" back and forth between resonance forms. These difficulties may even persist into the second semester of organic chemistry.4 Student learning is invariably linked to the instruction students receive; indeed, research has shown that the instructor is the most significant factor impacting student learning in the classroom.8 For example, some misconceptions may be attributed in large part to the representations that students are exposed to in the course. Taber found that the classic representation of aromatic rings, depicted as hexagons with a circle in the middle, was problematic for some students, as they interpreted the circle to be a type of reservoir of electrons.7 Similarly, Kim et al.
suggested that Kekulé structures may pose barriers to students' understanding of resonance structures.9 The authors found that even after using the Kekulé structures in a guided-inquiry approach, the majority of students still believed that individual resonance structures existed as distinct entities, a testament to the endurance of this misconception. However, the same study posited that instructors could combat this misconception by developing students' metarepresentational competencies. When asked to create their own representations beyond the traditional structures (e.g., Kekulé structures) and evaluate each one's strengths and limitations, students shifted towards a unitization view, acknowledging that the real structure was an average of the resonance structures. Moreover, Carle and Flynn suggest that teaching students to draw and interpret hybrid structures would help to counter the misconception that resonance structures exist in nature.10 They identified these abilities among the ten learning objectives most pertinent to resonance/delocalization in organic chemistry. In their analysis of introductory organic chemistry textbooks, the authors found that the concept of the hybrid structure was inconsistently covered and generally underrepresented, with some textbooks omitting visual representations of it entirely. When textbooks did discuss the hybrid structure, some relied on heuristics to explain relative contributions to the hybrid, rather than
causal links. It is not surprising, then, that students often resort to operational definitions of resonance, such as how to draw structures, rather than a conceptual definition.6,11 Further, Carle and Flynn found that the hybrid structure was severely under-represented in assessments and textbook practice problems, appearing in less than 1% of the 1548 resonance practice problems in total.10 For the last few decades, research on effective teaching has focused on evaluating teachers' pedagogical knowledge, with a growing interest in the topic-specific level.12 This topic-specificity lends itself to a type of teacher knowledge known as pedagogical content knowledge (PCK), defined as the combination of knowledge and skills that an instructor uses to teach a specific topic to a specific group of students in a particular setting.13 This construct has been widely discussed,14–16 dissected,17–19 and even applied as a framework for examining and characterizing instructors' knowledge about teaching and their actions in the classroom.20 Thus far, much of the PCK literature has been dominated by studies looking at K–12 teachers, despite the construct's value and applicability to higher education,21 and fewer studies have looked specifically at higher education faculty in organic chemistry courses. Studies looking at graduate teaching assistants (GTAs) found that while PCK was somewhat commensurate with experience, GTAs' PCK was still lacking overall.22–24 For example, GTAs relied on heuristics and other shallow explanations when teaching thin-layer chromatography23 and had limited diversity in their teaching strategies with regard to teaching organic mechanisms.24 Further, while strong content knowledge appeared to be necessary to develop PCK, it was not necessarily sufficient in itself.
Davidowitz and Potgieter reached similar conclusions in their study on the content knowledge and PCK of twelfth grade organic chemistry teachers.25 Given the topic- and context-dependency of PCK, more work is needed to address the many unanswered questions about the nature of PCK within higher education STEM courses. In this study, we use PCK as a lens to explore seven post-secondary organic chemistry instructors’ teaching of the resonance hybrid. We also administered a survey to the instructors’ students to understand if and how instructors’ PCK could manifest through learning outcomes.

12.2  Theoretical Framework

12.2.1  PCK in the Sciences

Pedagogical content knowledge (PCK) was first named by Shulman as one type of teacher content knowledge, and described as the "special amalgam" of content and pedagogy that is unique to teachers, distinct from general pedagogical knowledge and content knowledge, among others.26,27 Several models of PCK have been developed since,17,28,29 including the "Magnusson model," which was designed specifically for use in the sciences.30 This PCK model comprises five components, though the precise names and definitions have seen slight modifications to better suit researchers' contexts. Four of the components and their definitions are listed in Table 12.1.


Table 12.1  PCK components according to the Magnusson model. The first column contains four PCK components, along with their abbreviations, from the Magnusson model.30 Sub-components are the specific types of teacher knowledge contained within each component.

[KoC] Knowledge of Curriculum
● Goals and important content students need to know.
● Resources and tools that exist to teach the topic.
● Vertical and horizontal curriculum.

[KoS] Knowledge of Students
● About students (their majors, demographics, etc.).
● Material students find easy/difficult.
● Prior knowledge students hold.
● Students' affective domain (motivations, attitudes, etc.).

[KoIS] Knowledge of Instructional Strategies
● Teaching methods for the topic.
● Order and structure of the content.

[KoA] Knowledge of Assessment
● Assessment methods to gauge student understanding.
● Content that is important to assess.

It is pertinent to note that Magnusson et al.30 included a fifth construct, “Orientations to Teaching Science,” as an overarching PCK component from which the other four components are shaped. However, this component has historically been difficult to operationalize consistently31 and was thus excluded from our framework. Beyond measuring the individual PCK components, Park and Oliver argued that it is the ability to integrate the various PCK components that defines an instructor’s effectiveness.32 Subsequent work has since measured PCK component integrations, often using the Magnusson model as a framework.33–36

12.2.2  Coming to a Consensus on PCK

Attempts to reconcile the definition of PCK came in the form of summits in 2012 and 2016. These meetings produced the Consensus Model in 2015,16 and later the Refined Consensus Model (RCM).19 The RCM depicts three types of PCK and the knowledge that comprises them (Figure 12.1). The outermost edge of the RCM includes five professional knowledge bases: content knowledge, curricular knowledge, assessment knowledge, knowledge of students, and pedagogical knowledge. These knowledge bases largely overlap with the four PCK components of the Magnusson model (minus the content knowledge) and, for consistency, the remainder of this chapter will refer to them as "PCK components." The first layer of PCK is collective PCK (cPCK), defined as the public professional knowledge held by multiple educators in the field, which may be shared by a community of practice or found in the literature.


Figure 12.1  The refined consensus model depicts the three layers of PCK (cPCK, pPCK, and ePCK) surrounded by five knowledge bases. Reproduced from ref. 20 with permission from Springer Nature, Copyright 2019.

Nested within cPCK is personal PCK (pPCK), which is the total of the individual professional knowledge held by an instructor and is shaped by their experiences over time. The third type, enacted PCK (ePCK), is a subset of pPCK, defined as the professional knowledge that instructors use while planning to teach, during the act of teaching, or while reflecting on their teaching. Filters and amplifiers, such as teaching context, student feedback, and instructor beliefs and experiences are intercalated between the layers and influence the composition of each type of PCK.19

12.2.3  Tying It All Together

This study looked at organic chemistry professors and their PCK with respect to their teaching of the resonance hybrid and was guided by the following two research questions (RQs):

1. What is the nature of the enacted pedagogical content knowledge (ePCK) of organic chemistry instructors in the context of teaching the resonance hybrid?
2. To what extent do students' abilities to describe and draw a resonance hybrid derived from two resonance forms relate to their instructors' ePCK on this topic?


Chapter 12

We leveraged both the RCM and the Magnusson model to guide data collection and analysis. As we were most interested in eliciting the instructors' perspectives on what they do in the classroom while teaching the resonance hybrid, we focused on their ePCK. Using the RCM's definition of ePCK, the interviews sought to capture the instructors' descriptions of their lesson plans for teaching the resonance hybrid and their reflections on prior experiences doing so. The four PCK components and subcomponents of the Magnusson model (Table 12.1) provided a starting point for extracting and characterizing the instructors' ePCK (RQ1) and facilitated a direct comparison with student outcomes (RQ2).

12.3  Methods

This multimethod research study,37 which was approved by the institutional review boards at the University of Nebraska-Lincoln and the University of Virginia, relied on two different sources of data from two populations: interviews with organic chemistry instructors and survey data from their students.

12.3.1  Participants

The recruitment of organic chemistry instructors consisted of two approaches. First, we invited instructors from the authors' institutions. Second, we requested permission from the organizers of organic-chemistry-focused symposia at the Biennial Conference on Chemical Education (BCCE) to make a recruiting announcement during the break. Through these two approaches, fifteen instructors were recruited from six different institutions (A–F, with two instructors from Institution A) across the United States that vary in size and institutional classification (Table 12.2). Following instructor recruitment, we were able to collect online survey data from the students of seven of these instructors in their first-semester organic chemistry courses. In total, 361 students provided complete surveys. Demographic data for these students were not collected. In this chapter, we focus on the seven instructors from whom we were able to collect student data.

12.3.2  Data Collection

A semi-structured interview protocol was developed to elicit instructors' ePCK in the teaching of resonance within a first-semester organic chemistry course. The interview protocol consisted of open-ended questions that prompted the instructors to describe how they taught resonance (see Electronic Supplementary Information, ESI). The instructors recruited from the authors' institutions were interviewed in the privacy of their own offices, while the instructors recruited at BCCE were interviewed either during the conference or later via the virtual platform Zoom.


To explore the relationship between instructors' ePCK and student understanding, a six-question online survey was used to probe student conceptions of resonance (see ESI). Students were allowed to type their responses to the short-answer questions and upload hand-drawn structures where prompted. These surveys were collected at Institution A in Spring 2018 and at Institutions B–F in Fall 2018. The surveys were distributed after students had been assessed on the topic of resonance (typically after the first mid-term exam).

Table 12.2  Characteristics of the sample. The first column lists the identification codes used to describe each instructor in this study. The subsequent five columns list background information for each instructor participant. The final column shows the number of student survey responses from each course.

ID: A1, A2, B, C, D, E, F
Carnegie classification (student enrollment): Doctoral Universities: Very High Research Activity (25 000); Baccalaureate Colleges: Arts & Sciences Focus (1500); Four-Year Bachelor's Degree: Midsize

Models for connectivity of molecules and stereochemistry, features that both have dynamic counterparts (i.e., bond breaking and making, and formation of stereochemistry, respectively), tended to perform less well than models for static features without a similar dynamic counterpart. In this case, multiple features have similar markers (i.e., words and phrases) in students' writing, making the models more difficult to train. The cause–effect only model reached an accuracy of 79%, with Cohen's kappa and MCC of 0.585. Although this model did not perform as well as the other models, it still performs quite well for an ATA model of cause-and-effect reasoning, particularly given the use of a CNN model, which requires minimal computational resources to train. Cause-and-effect reasoning is challenging for an ATA model to recognize because it can be nuanced or implied within the writing (i.e., markers of cause and effect such as "because of" or "due to" may not always be present).65–67 Typically, more complex machine learning models, which require much greater computational resources to train, must be used to model more complex reasoning within text.9,66,68,69 We believe that our electronic causal reasoning model performed better than the cause–effect only model due to a greater presence of explicit markers in students' writing, such as the indication of electronic properties alongside language about electron movement.
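The evaluation metrics reported here (accuracy, Cohen's kappa, and the Matthews correlation coefficient) can all be computed from a binary confusion matrix. The sketch below is an illustration, not code from the study; the function name is ours, and real analyses would typically use library implementations such as those in scikit-learn.

```python
import math

def binary_metrics(y_true, y_pred):
    """Accuracy, Cohen's kappa, and MCC from binary (0/1) labels."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    n = tp + tn + fp + fn
    accuracy = (tp + tn) / n
    # Chance agreement expected from the marginal label frequencies
    p_e = ((tp + fn) * (tp + fp) + (tn + fp) * (tn + fn)) / n**2
    kappa = (accuracy - p_e) / (1 - p_e) if p_e < 1 else 1.0
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return accuracy, kappa, mcc

# Example: a model that labels 6 of 8 responses correctly
acc, kappa, mcc = binary_metrics([1, 1, 1, 1, 0, 0, 0, 0],
                                 [1, 1, 1, 0, 0, 0, 0, 1])
# acc = 0.75, kappa = 0.5, mcc = 0.5
```

Unlike raw accuracy, kappa and MCC discount agreement expected by chance, which is why they are the preferred metrics for imbalanced classification data such as feature-presence labels.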

17.6  Implications

17.6.1  Implications for Research

We have presented a modified version of the mechanistic reasoning framework presented by Russ et al.51 and adapted by Watts et al.8 as a lens through which to analyze students' mechanistic reasoning about three different organic chemistry reaction mechanisms. Other researchers can use this modified framework to study how students reason about additional reaction mechanisms and to study how specific writing prompts elicit features of mechanistic reasoning. Researchers can use similar methods to develop ATA models to evaluate students' writing using other frameworks and writing prompts. These automated models can be used in further research to develop instructor- and student-facing feedback platforms that provide formative, immediate feedback. Furthermore, automated models can be used to evaluate the impact of interventions, such as tutorials based on findings in the literature,5,6 on students' use of features of mechanistic reasoning.

17.6.2  Implications for Practice

Instructors can use or modify the WTL prompts described in this study to support students' mechanistic reasoning. Furthermore, our findings indicate that instructors should pay careful attention when developing prompts, ensuring that the language is specific enough to elicit the desired responses from students. Instructors can also use the framework described in this study to develop constructed-response items and WTL prompts that elicit features of mechanistic reasoning. Instructors may use these items in their courses for both formative and summative assessment and can use the framework to support students' consideration of all components necessary for mechanistic reasoning. If you are an instructor interested in using our ATA models for formative assessment, please contact us.

17.7  Limitations

The work presented in this chapter provides an example of a broadly applicable mechanistic reasoning framework and corresponding ATA models, which have been used for students' written descriptions and explanations of three different mechanisms. However, there are many reaction mechanisms in the organic chemistry curriculum, and we cannot assume that the framework and models will perform similarly with other mechanisms. Further testing with different mechanism types is required to broaden the scope of these models. Additionally, all the training and testing data were collected from students' final drafts in one course at one university. There are many ways to describe and explain mechanisms that the students in this study may not have exemplified in their final drafts. More work is required with additional data sources and populations of students to determine whether the models perform similarly with different populations. Lastly, the approach of analyzing students' writing is limited in that writing might not capture students' full understanding or reasoning. While the models presented here are a useful tool for providing students and instructors with formative feedback, we do not recommend their use for summative assessment due to the limited population from which the data were collected.

Machine Learning Models for Automated Analysis of Written Descriptions


17.8  Conclusions

This study presented a modified version of Russ et al.'s and Watts et al.'s mechanistic reasoning framework applied to students' written explanations of three different organic chemistry reaction mechanisms. The presence of features necessary for mechanistic reasoning varied based on the nature of the writing prompt and the specific mechanism students were explaining. Students' responses were used to train several ATA models that can successfully predict whether students included these features in their written responses. The results of this study show that the modified mechanistic reasoning framework can be applied to identify features of students' writing across multiple types of mechanisms. The analysis of students' writing for different mechanisms indicates that the nature of the prompt influences how mechanistic reasoning is elicited. This study additionally indicates that CNN models can successfully provide automated identification of features necessary for mechanistic reasoning in students' writing. These findings extend the literature by indicating that ATA models can be successfully deployed across students' writing for three different organic reaction mechanisms, with utility for identifying students' written descriptions of how and why mechanisms occur.

References

1. M. Cooper, H. Kouyoumdjian and S. Underwood, J. Chem. Educ., 2016, 93, 1703.
2. O. Crandell, H. Kouyoumdjian, S. Underwood and M. Cooper, J. Chem. Educ., 2018, 96, 213.
3. O. M. Crandell, M. A. Lockhart and M. M. Cooper, J. Chem. Educ., 2020, 97, 313.
4. A. J. Dood, J. C. Dood, D. Cruz-Ramírez de Arellano, K. B. Fields and J. R. Raker, J. Chem. Educ., 2020, 97, 3551.
5. A. J. Dood, K. B. Fields and J. R. Raker, J. Chem. Educ., 2018, 95, 1267.
6. A. J. Dood, J. C. Dood, D. Cruz-Ramírez de Arellano, K. B. Fields and J. R. Raker, Chem. Educ. Res. Pract., 2020, 21, 267.
7. A. J. Dood, K. B. Fields, D. Cruz-Ramírez de Arellano and J. R. Raker, Can. J. Chem., 2019, 97, 711.
8. F. M. Watts, J. A. Schmidt-McCormack, C. A. Wilhelm, A. Karlin, A. Sattar, B. C. Thompson, A. R. Gere and G. V. Shultz, Chem. Educ. Res. Pract., 2020, 21, 1148.
9. B. A. Winograd, A. J. Dood, R. Moeller, A. Moon, A. Gere and G. Shultz, in LAK21: 11th International Learning Analytics and Knowledge Conference, Association for Computing Machinery, New York, NY, USA, 2021, pp. 586–591.
10. B. J. Yik, A. J. Dood, D. C.-R. de Arellano, K. B. Fields and J. R. Raker, Chem. Educ. Res. Pract., 2021, 22, 866.
11. K. Noyes, R. L. McKay, M. Neumann, K. C. Haudek and M. M. Cooper, J. Chem. Educ., 2020, 97, 3923.


12. B. A. Winograd, A. J. Dood, S. A. Finkenstaedt-Quinn, A. R. Gere and G. V. Shultz, in 14th Computer-Supported Collaborative Learning (CSCL) Proceedings, Bochum, Germany, 2021, pp. 11–18.
13. W. Goodwin, in Philosophy of Chemistry, ed. A. I. Woody, R. F. Hendry and P. Needham, North-Holland, Amsterdam, 2012, vol. 6, pp. 309–327.
14. W. Goodwin, Ann. N. Y. Acad. Sci., 2003, 988, 141.
15. R. Kozma, Learn. Instr., 2003, 13, 205.
16. G. Bhattacharyya, J. Chem. Educ., 2013, 90, 1282.
17. G. Bhattacharyya and G. M. Bodner, J. Chem. Educ., 2005, 82, 1402.
18. N. Grove, M. Cooper and K. Rush, J. Chem. Educ., 2012, 89, 844.
19. R. Ferguson and G. M. Bodner, Chem. Educ. Res. Pract., 2008, 9, 102.
20. A. B. Flynn and R. B. Featherstone, Chem. Educ. Res. Pract., 2017, 18, 64.
21. T. L. Anderson and G. M. Bodner, Chem. Educ. Res. Pract., 2008, 9, 93.
22. S. B. Wilson and P. Varma-Nelson, J. Chem. Educ., 2019, 96, 25.
23. M. E. Anzovino and S. L. Bretz, Chem. Educ. Res. Pract., 2015, 16, 797.
24. M. E. Anzovino and S. L. Bretz, Chem. Educ. Res. Pract., 2016, 17, 1019.
25. R. S. DeFever, H. Bruce and G. Bhattacharyya, J. Chem. Educ., 2015, 92, 415.
26. M. Cooper, N. Grove, S. Underwood and M. Klymkowsky, J. Chem. Educ., 2010, 87, 869.
27. M. Cooper, L. Corley and S. Underwood, J. Res. Sci. Teach., 2013, 50, 699.
28. A. Kraft, A. M. Strickland and G. Bhattacharyya, Chem. Educ. Res. Pract., 2010, 11, 281.
29. N. E. Bodé, J. M. Deng and A. B. Flynn, J. Chem. Educ., 2019, 96, 1068.
30. G. Bhattacharyya, Chem. Educ. Res. Pract., 2008, 9, 84.
31. S. A. Finkenstaedt-Quinn, F. M. Watts, M. N. Petterson, S. R. Archer, E. P. Snyder-White and G. V. Shultz, J. Chem. Educ., 2020, 97, 1852.
32. M. N. Petterson, F. M. Watts, E. P. Snyder-White, S. R. Archer, G. V. Shultz and S. A. Finkenstaedt-Quinn, Chem. Educ. Res. Pract., 2020, 21, 878.
33. F. M. Watts, I. Zaimi, D. Kranz, N. Graulich and G. V. Shultz, Chem. Educ. Res. Pract., 2021, 22, 364.
34. D. Cruz-Ramírez de Arellano and M. H. Towns, Chem. Educ. Res. Pract., 2014, 15, 501.
35. N. Graulich, S. Hedtrich and R. Harzenetter, Chem. Educ. Res. Pract., 2019, 20, 924.
36. M. L. Weinrich and H. Sevian, Chem. Educ. Res. Pract., 2017, 18, 169.
37. I. Caspari, M. L. Weinrich, H. Sevian and N. Graulich, Chem. Educ. Res. Pract., 2018, 19, 42.
38. I. Caspari, D. Kranz and N. Graulich, Chem. Educ. Res. Pract., 2018, 19, 1117.
39. M. Cooper and M. Klymkowsky, J. Chem. Educ., 2013, 90, 1116.
40. M. Cooper, R. L. Stowe, O. M. Crandell and M. W. Klymkowsky, J. Chem. Educ., 2019, 96, 1858.
41. A. Moon, R. Moeller, A. R. Gere and G. V. Shultz, Chem. Educ. Res. Pract., 2019, 20, 484.
42. B. I. Grimberg and B. Hand, Int. J. Sci. Educ., 2009, 31, 503.


43. A. Moon, A. R. Gere and G. V. Shultz, Sci. Educ., 2018, 102, 1007.
44. L. Flower and J. R. Hayes, Coll. Compos. Commun., 1981, 32, 365.
45. L. Flower and J. R. Hayes, Writ. Commun., 1984, 1, 120.
46. J. R. Hayes, in The Science of Writing, ed. C. M. Levy and S. Ransdell, Routledge, 1996.
47. S. A. Finkenstaedt-Quinn, M. Petterson, A. Gere and G. Shultz, J. Chem. Educ., 2021, 98, 1548.
48. P. Anderson, C. M. Anson, R. M. Gonyea and C. Paine, Res. Teach. Engl., 2015, 50, 199.
49. A. R. Gere, N. Limlamai, E. Wilson, K. MacDougall Saylor and R. Pugh, Writ. Commun., 2019, 36, 99.
50. T. Gupte, F. M. Watts, J. A. Schmidt-McCormack, I. Zaimi, A. R. Gere and G. V. Shultz, Chem. Educ. Res. Pract., 2021, 22, 396.
51. R. S. Russ, R. E. Scherr, D. Hammer and J. Mikeska, Sci. Educ., 2008, 92, 499.
52. P. K. Machamer, L. Darden and C. F. Craver, Philos. Sci., 2000, 67, 1.
53. L. Darden, Philos. Sci., 2002, 69, S354.
54. L. Keiner and N. Graulich, Chem. Educ. Res. Pract., 2020, 21, 469.
55. A. P. Kirilenko and S. Stepchenkova, PLoS One, 2016, 11, e0149787.
56. F. M. Watts and S. A. Finkenstaedt-Quinn, Chem. Educ. Res. Pract., 2021, 22, 565.
57. F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, M. Blondel, P. Prettenhofer, R. Weiss, V. Dubourg, J. Vanderplas, A. Passos, D. Cournapeau, M. Brucher, M. Perrot and É. Duchesnay, J. Mach. Learn. Res., 2011, 12, 2825–2830.
58. G. van Rossum and F. L. Drake, The Python Language Reference Manual, Network Theory Ltd, 2011.
59. F. Chollet, Keras, GitHub repository, 2015, https://github.com/fchollet/keras, accessed October 2022.
60. J. Cohen, Educ. Psychol. Meas., 1960, 20, 37.
61. D. Chicco and G. Jurman, BMC Genomics, 2020, 21, 6.
62. Y. LeCun, Y. Bengio and G. Hinton, Nature, 2015, 521, 436.
63. F. Chollet, in Deep Learning with Python, Manning Publications Company, 2018, pp. 178–232.
64. D. M. Williamson, X. Xi and F. J. Breyer, Educ. Meas. Issues Pract., 2012, 31, 2.
65. E. Blanco, N. Castell and D. Moldovan, in Proceedings of the Sixth International Conference on Language Resources and Evaluation (LREC'08), European Language Resources Association (ELRA), Marrakech, Morocco, 2008.
66. T. Dasgupta, R. Saha, L. Dey and A. Naskar, in SIGDIAL Conference, 2018.
67. N. Asghar, 2016, arXiv:1605.07895 [cs].
68. I. Hendrickx, S. N. Kim, Z. Kozareva, P. Nakov, D. Ó. Séaghdha, S. Padó, M. Pennacchiotti, L. Romano and S. Szpakowicz, 2019, arXiv:1911.10422 [cs].
69. S. Zhao, Q. Wang, S. Massung, B. Qin, T. Liu, B. Wang and C. Zhai, in Proceedings of the Tenth ACM International Conference on Web Search and Data Mining, Association for Computing Machinery, New York, NY, USA, 2017, pp. 335–344.

Chapter 18

Development of a Generalizable Framework for Machine Learning-based Evaluation of Written Explanations of Reaction Mechanisms from the Post-secondary Organic Chemistry Curriculum

Jeffrey R. Raker,*a Brandon J. Yika and Amber J. Doodb

a University of South Florida, Tampa, Florida, USA; b University of Michigan, Ann Arbor, Michigan, USA
*E-mail: [email protected]

Advances in Chemistry Education Series No. 10. Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices. Edited by Nicole Graulich and Ginger Shultz. © The Royal Society of Chemistry 2023. Published by the Royal Society of Chemistry, www.rsc.org

18.1  Are Drawn Reaction Mechanisms Enough to Evaluate Understanding?

Drawn pictures of reaction mechanisms are insufficient to evaluate understanding.1–3 The meaning ascribed by learners to reaction mechanisms does not mirror the meaning ascribed by practicing chemists.4–9 Reaction mechanisms, as depicted using the electron-pushing formalism, are representations of the stepwise process of a chemical reaction.6,10 Reaction mechanisms are also used to predict and explain experimental observations.8–11 Thus, understanding of reaction mechanisms should be evaluated based on the meaning ascribed to mechanistic pictures rather than just the creation of such pictures.12,13 In the context of organic chemistry instruction, this would require oral or written explanations of what is being depicted by a given mechanistic picture or by a mechanistic picture drawn by the learner.14

For educators who teach small-enrollment courses (for argument's sake, assume fewer than 20 students), asking learners to provide such explanations on assessments adds relatively little overall time, although providing feedback on them is time intensive. However, for educators who teach large-enrollment courses (assume 100 or more students), even with the aid of teaching assistants, providing feedback on such assessments can become burdensome. And, for all educators wanting to use such explanations in just-in-time teaching, with a classroom response system, or by other formative means, meaningful feedback cannot be provided in real time because of the time needed to evaluate explanations.15,16 Computer-based scoring methods have thus been built using machine learning and text analysis tools to address this concern. The authors of this chapter have developed and disseminated such tools.16–18 Our work extends beyond providing assessment data to educators; we have also evaluated the use of our computer-based scoring methods in providing adaptive learning experiences.15,19 In this chapter we summarize our body of work in this area while setting a course for where we envision our (and others') future work. The research literature is clear: current assessment strategies are insufficient for measuring understanding of reaction mechanisms. We offer a pathway to build better assessment tools for evaluating understanding of reaction mechanisms and thereby address this insufficiency.

18.2  Learner Understanding of Reaction Mechanisms

While some learners construct a foundational understanding of reaction mechanisms, most students gain only a surface-level understanding.2,5,12,13,20–23 For example, in a study of graduate-level students pursuing research in organic chemistry (and thus assumed to be advanced organic chemistry learners), the lines, letters, numbers, wedges, dashes, etc. of reaction mechanisms did not carry the same meaning that practicing organic chemists ascribe to such mechanistic pictures.6 In a study of undergraduate-level students completing their first course in organic chemistry, learners were found to treat mechanistic arrows as "decorations" rather than as representational tools for communicating the processes of bond breaking and forming.24


A common implication of studies on learner understanding of reaction mechanisms is that mechanisms should be taught in authentic contexts (i.e., in the predictive and explanatory way that mechanisms are used in research).4,11,13,25,26 To develop such mechanistic thinking skills (i.e., the use of mechanisms to explain and predict chemical reactions), learners need to experience and construct their understanding through organic chemistry research.27 This is impractical for most settings where organic chemistry is taught; there are only so many undergraduate researcher positions available. Therefore, at minimum, learners should be engaged in authentic tasks that mirror the practice of chemistry.11,28–30

18.3  Assessment of Learner Understanding of Reaction Mechanisms

Drawing reaction mechanisms is a cornerstone of assessment in organic chemistry education.2,4,8–10 Many organic chemistry educators believe that when a learner constructs a reaction mechanism, the learner ascribes the same meaning to it as they do.24 Unfortunately, such assessments can be completed with little understanding of the meaning associated with a reaction mechanism.5–7,20,23 Rote memorization and surface-level study approaches, often used by learners in organic chemistry courses, are sufficient for success in drawing reaction mechanisms.31

Assessments drive learning.26 If we want students to ascribe meaning to reaction mechanisms, then we must ask learners to articulate the meaning they ascribe to mechanistic pictures.1,12–14,32 Eliciting learners' ascribed meaning can be as simple as asking learners to describe what is happening and why for a given reaction mechanism;12,15,17,33 Figure 18.1 offers an example of such an assessment item. Note: some researchers ask students to draw their own mechanism,1 while others provide a mechanism that students must describe and explain.17 More importantly, assessment prompt formats should be carefully evaluated to ensure that the prompt elicits the desired information about learning.1,33 Engaging learners in using words to explain mechanistic pictures sets a foundation for reaction mechanisms being authentic communication tools rather than collections of lines, letters, and numbers.7,10,15,26,32 Guidelines and frameworks for developing (or transforming) assessment items in line with the practice of chemistry have been offered in the literature.1,11,26,29,34 These frameworks include sample assessment items with prescriptive item formats that can easily be adapted for use with different reaction mechanisms and, more broadly, with other organic chemistry content such as spectroscopy.

While assessment items that promote and capture understanding of reaction mechanisms have been evaluated and reported in the literature,1,12


Figure 18.1  An example assessment item for eliciting understanding of a reaction mechanism.

the time necessary to implement such assessment items in an instructional context can be prohibitive. In summative assessments, such as term and final examinations, where there is an expectation that grading will take time and feedback will be delayed, assessment items reported in the research literature may be appropriate. However, if such understanding of reaction mechanisms is to be evaluated in the context of a classroom experience (i.e., formative assessment), there is insufficient time to use such assessment items. There are alternative, multiple-choice-based assessment items that circumvent this;35 however, multiple-choice assessments are antithetical to the authentic assessments proposed here. Thus, clickers and other classroom response systems limited to categorical answer options do not measure the same understanding as a written response.

Predictive models developed using machine learning automated text analysis tools provide a suitable means of addressing the need for quick and meaningful analysis of written responses. There is a growing number of assessment items that have associated computer-based predictive models.15–17,36–39 However, many of these models are particular to a specific assessment item. While this may be necessary for some concepts and skills, understanding of reaction mechanisms presents an opportunity to build a more generalizable predictive model. In such a predictive model, understanding of reaction mechanisms would be evaluated irrespective of the mechanism. For example, understanding of the role of electrophiles in a reaction mechanism needs only to be assessed with an assessment item that asks for an explanation of what is happening and why in a reaction that includes an electrophile. Such a generalizable predictive model has been demonstrated in organic chemistry courses for evaluating written explanations that correctly use the Lewis acid–base model.16 We demonstrated that a written explanation of any proton transfer reaction or Lewis acid–base reaction could be appropriately evaluated with this model.

18.4  Training Machine Learning Models for Automated Text Analysis

As an introduction to our presentation of a proposed framework for evaluating understanding of reaction mechanisms, we offer a brief explanation of how machine learning models have been developed for evaluating understanding in STEM instruction.36–38,40–42 This process involves a series of steps: outlining a classification scheme, collecting and classifying data by hand, processing data and extracting features, training the model, and evaluating the model.

In chemistry education settings, classification algorithms have been trained to evaluate students' understanding.16,36 The classifications correspond to levels of understanding or achievement in an assessment rubric; in a machine learning context, the rubric is referred to as a classification scheme. To develop models, data must be human-classified (or human-scored), so a rubric is essential for consistent classification within and between the people evaluating the assessment responses. For example, in our report on a model to evaluate acid–base understanding, a rubric was developed that included phrases and example responses for when a learner was correctly or incorrectly using the Lewis acid–base model in their explanation of an acid–base reaction or of the acidity or basicity of a given compound.16 It is important to note that the rubric was built on the chemical education research on understanding of acidity and basicity;18,33,43 thus, our rubric was informed by the research literature.

Next, data are collected and human-classified based on the rubric. To ensure that suitable numbers of responses are obtained for each level in the classification scheme, large numbers of responses are necessary to develop highly predictive models. Such numbers range from a couple hundred responses to over one thousand.36,37 While there are practical and computational limits to dataset size, larger numbers are valuable to the model development process. For the type of generalizable model that we developed for evaluating correct Lewis acid–base use (and the type of generalizable model we propose herein), it was also necessary to collect data using a maximum variation of assessment prompts. A greater variation of prompts extends the utility of the model for evaluating any possible assessment prompt that asks for an explanation of acidity, basicity, and associated reactions. For example, our Lewis acid–base model


was developed using 8520 responses from 15 assessment items.16 Collected data are then human-classified (i.e., "graded") before data processing.

Responses are processed and features extracted in preparation for model building. Processing may include converting all text to lower case, removing extraneous symbols and emojis, removing unhelpful words such as "the" and "a" (i.e., stop words), fixing misspellings, and replacing synonyms.39,44,45 Processing thus can involve building a dictionary of stop words and replacements; while there are general-purpose dictionaries available to begin this process, there are no readily available dictionaries for chemistry contexts. In reporting the Lewis acid–base model, we made our dictionaries open access for others to use as a starting point when developing new models.46 Feature extraction includes exploring the dataset to determine which words or combinations of one or more words are most closely associated with the classification topic or concept.47 For our Lewis acid–base model, the feature extraction process resulted in 257 features that were then used to train the model.16

Model training involves evaluating a series of empirical models to determine which performs best when predicting the human classifications. There are many possible algorithms.48 Model training thus requires a mix of data science and disciplinary expertise to determine which algorithm or algorithms are appropriate for the data and context. The goal when training a model is to find a balance between maximizing classification ability and preventing overtraining on the training set. This is especially the case with the types of assessment data that machine learning models have been developed to evaluate.49 Last, the model is evaluated using a mix of cross-validation, split-validation, and external validation methods.
Validation is accomplished by testing the trained model on different or new sets of data and can be measured using several metrics, including true/false positives/negatives, Cohen's kappa,50 percent accuracy,51 the F1 metric,44 and the Matthews correlation coefficient.52–54 As with any evaluation of a measurement, an argument must be made for the suitability of the validation results against acceptable guidelines for each metric. In our model development efforts, we have been particularly interested in the evaluation metrics for the external validation set, as those data were collected after the model was trained, and thus any unique variations in the external validation set are not captured in model training. The external validation set in our Lewis acid–base model development process (i.e., new data and new assessment prompts) met and exceeded the suite of evaluation metrics; the purpose of external validation is to demonstrate how a "first use" of the model behaves, testing the model beyond the responses used to train and initially validate it.16

The work done broadly in STEM education, and more specifically with the Lewis acid–base model in chemistry,16 provides a proof-of-concept foundation for considering how a generalizable framework can be used to evaluate understanding of reaction mechanisms. Such a framework would lend itself to having associated machine learning models developed such that understanding of reaction mechanisms could be evaluated quickly in formative contexts.
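The processing and feature extraction steps described above can be sketched as a minimal Python pipeline. This is an illustration only, not the study's actual code: the stop-word and synonym dictionaries below are invented for the example, whereas the study's open-access dictionaries (ref. 46) are far more extensive and were built from student writing.

```python
import re

# Illustrative (hypothetical) dictionaries for the sketch only
STOP_WORDS = {"the", "a", "an", "is", "of", "to"}
SYNONYMS = {"nucleophillic": "nucleophilic", "grabs": "attacks"}

def preprocess(response):
    """Lower-case, strip extraneous symbols, drop stop words, replace synonyms."""
    text = response.lower()
    text = re.sub(r"[^a-z0-9\s\-]", " ", text)   # keep letters, digits, hyphens
    return [SYNONYMS.get(t, t) for t in text.split() if t not in STOP_WORDS]

def extract_features(tokens, feature_terms):
    """Binary presence features for terms of interest (unigrams or bigrams)."""
    bigrams = {" ".join(pair) for pair in zip(tokens, tokens[1:])}
    vocab = set(tokens) | bigrams
    return {term: int(term in vocab) for term in feature_terms}

tokens = preprocess("The nucleophile attacks the carbon, because of the partial charge!")
# tokens = ['nucleophile', 'attacks', 'carbon', 'because', 'partial', 'charge']
feats = extract_features(tokens, ["nucleophile", "partial charge", "electrophile"])
# feats = {'nucleophile': 1, 'partial charge': 1, 'electrophile': 0}
```

The resulting binary feature vectors are the kind of input a classification algorithm would be trained on; in practice, feature selection over the full dataset determines which terms (257 in the Lewis acid–base model) are retained.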


18.5  F  ramework for Evaluating Understanding of Reaction Mechanisms Reaction mechanisms serve the purpose of explaining and predicting the outcome of chemical reactions including stereochemical and regiochemical outcomes.6 Mechanistic reasoning in the context of organic chemistry has been articulated through a consensus-driven survey research study of organic chemists as “the stepwise reorganization/redistribution of electrons during a chemical process” and “arises from an established body of knowledge in chemical reactions and reactivity.”10 Expert-level mechanistic reasoning requires the application and integration of multiple concepts including electronegativity, polarity, formal and partial charges, Lewis acid–base theory, and identification and classification of nucleophiles and electrophiles.10 Expert-level understanding includes both interpreting and drawing mechanistic pictures.6 Such understanding is developed through experiences applying, testing, and integrating understanding in the context of pseudo-research or authentic-research practice.11,21 Learners often consider each specific reaction mechanism as an isolated piece of knowledge,2,5 whereas experts have constructed an integrated web of reaction mechanism knowledge.55 Klein, for instance, has proposed reaction motifs in their undergraduate-level organic chemistry textbook as a structure for helping learners develop more expert-like integrated knowledge as new reaction mechanisms are learned.56 The motifs initially include nucleophilic attack, loss of a leaving group, proton transfer, and carbocation rearrangement, and further evolve throughout the text. While this structure is helpful for considering how to categorize the steps of a reaction mechanism, the reaction motifs structure does not fully capture the levels of understanding and meaning ascribed by experts. 
Nevertheless, the reaction motif structure provides a starting point for considering a more generalizable framework for evaluating understanding of reaction mechanisms. An evaluation of reaction mechanism understanding from a reaction motifs perspective has the potential to inform learners of their understanding and the potential to transform how educators evaluate learner understanding and implement reaction mechanism instruction and curricula. We acknowledge that others have reported frameworks that have been used to evaluate overall/complete reaction mechanism explanations; these include levels of complexity of relations,21 mode of reasoning,12,57 and causal mechanistic characterization.1,21,23,33,58–60 The framework we report herein (i.e., levels of explanation sophistication) synthesizes and operationalizes these previously reported frameworks for use in formative assessment contexts such that machine learning models can be developed for evaluating understanding. The framework is an extension of the framework we first reported for evaluating explanations of a unimolecular nucleophilic substitution reaction.17 Development of predictive models using machine learning techniques is our key goal, and thus, our approach to considering how to evaluate understanding using the levels of explanation sophistication framework is largely influenced by that goal.

Generalizable Framework for Machine Learning-based Evaluation


18.5.1  Levels of Explanation Sophistication

Levels of explanation sophistication provide a structure for constructing assessment rubrics (i.e., classification schemes) when evaluating understanding of reaction mechanisms, particularly for use with automated formative assessments. When considering how to synthesize and operationalize frameworks in the chemical education literature for such assessment purposes, four levels were identified.

●● Absent—This level is missing from most previously reported frameworks but is important when considering how to classify responses. It is not uncommon for learners to respond to an assessment item where an explanation is required with “I don't know” or “I haven't studied this yet.” In contexts where handwritten explanations are obtained, this level captures illegible responses.1 Finally, responses whereby the assessment prompt is restated are included at this level, e.g., “This is the reaction between cyclohexene and hydrobromic acid.” No information about a learner's understanding of a concept is obtained at this level.
●● Descriptive—The descriptive level is characterized by simple descriptions. Explanations include stating the what of the reaction mechanism (e.g., “this methoxide anion attacks this carbocation” or “the Cl leaves as the leaving group in the second step”) without addressing the why of the reaction mechanism. At this level a learner is associating nomenclature and words with the structures and process described in the reaction mechanism.
●● Foundational—The foundational level is characterized by surface-level descriptions of why the reaction mechanism is occurring. Learners provide simple evaluations based on heuristics such as “iodide is a good leaving group” or “a tertiary carbocation is a stable intermediate”; implicit features such as “good/bad” or “stability” are not explained and justified. At this level a learner is beginning to associate concepts and principles for why a reaction is occurring with the mechanism they are explaining.
●● Complex—The complex level is characterized by a deep explanation that addresses both explicit and implicit features, including description of how electronics and sterics provide the why for reaction mechanisms. Learners provide explanations such as “iodine is a good leaving group due to its large electron shell and thus polarizability” and “the tertiary carbocation intermediate is stable due to hyperconjugation and inductive effects.” At this level a learner is integrating and applying their knowledge when explaining why the reaction mechanism is occurring.

These four levels of sophistication are hierarchical: absent, descriptive, foundational, and then complex; the levels have associated examples rooted in the teaching experiences of the authors and examples reported in the research literature.17


Summative and high-stakes assessment of such understanding is not common. Given the research literature on the time necessary to develop expert-level understanding of reaction mechanisms (including significant experience in the context of organic chemistry research),27 it may be more appropriate to consider such assessment of understanding for formative assessment purposes.61 Nearly all research reported on evaluating explanations in STEM education using automated text analysis suggests limiting the use of such assessments to formative purposes, where grades are based on completion rather than correctness.15,17 Correctness, though, provides an added layer of evaluation. For instance, one could imagine a scenario whereby a complex level of understanding is demonstrated but an aspect of the explanation is incorrect. How would this be classified or scored? Our approach has been to promote good understanding of reaction mechanisms, and thus correctness is important, while simultaneously recognizing that learning is being demonstrated even when an incorrect explanation is offered.17 We have previously reported both types of evaluation and associated predictive models: for example, Lewis acid–base use18 and correct/incorrect Lewis acid–base use.16 The predictive model used by an educator can thus be chosen to fit the specific instructional and pedagogical goals of that educator.

18.5.2  Evaluating Understanding of Electrophiles

Our proposed framework deviates from the aforementioned frameworks in that we do not suggest an overall level of sophistication can be determined for a given explanation; this includes deviating from our own work.17 As acknowledged in our work on developing a model to evaluate explanations of a unimolecular nucleophilic substitution reaction, different levels of explanation sophistication can be demonstrated by a given learner within the context of a single explanation.17 This is also supported in the literature in that different components of a reaction (e.g., a nucleophile or an electrophile) are not equally understood.62,63 Thus, a descriptive level could be demonstrated when describing a nucleophile/electrophile bond forming step, and a complex level could be demonstrated, in the same explanation, when describing the formation of a carbocation intermediate. Based on the levels of explanation sophistication framework, descriptions of each level with associated examples could be developed for a series of reaction aspects, including
●● nucleophiles
●● electrophiles
●● carbocation stability
●● leaving group stability
●● proton transfer
●● solvent effects.


Thus, granularity at the level of a mechanistic step or mechanistic component is proposed for evaluating explanations of reaction mechanisms. Operationalizations of the levels of explanation sophistication framework based on aspects of a reaction provide a means to bridge research on understanding of these different aspects of reactivity with implementation in instructional practice. For example, there is a growing body of knowledge on learner understanding of electrophiles.62,63 We report a potential rubric/classification scheme for how the levels of explanation sophistication framework could be operationalized (see Table 18.1). The examples provided are hypothetical responses informed by the research literature and do not reflect actual collected data from learners. Table 18.1 is informed by the work of Anzovino and Bretz,62,63 among others, on learner understanding of electrophiles. For example, learners rely heavily on explicit features to identify electrophiles.62,63 These explicit features, such as charges and lone pairs, are structural in nature and are used by students to explain the reactivity of electrophiles.62,63 Students tend to rely on rote memorization of features related to electrophilic behaviour rather than engaging with the deeper and more complex relationships, involving both explicit and implicit features, that have explanatory power over the reactivity of electrophiles.62,63 The distinction between the foundational and complex levels is grounded in this literature; foundational-level learners make use of explicit and implicit features, but rote memorization hinders a deeper explanation of phenomena, whereas complex-level learners are able to fully explain how implicit features govern reactivity.
What we realized in our study of Lewis acid–base model use is that the language chemists use to explain such reactions is largely independent of the specific reaction.16 In considering how to extrapolate our Lewis acid–base work to a broader context of reactions, we considered what language we would use to describe electrophiles, nucleophiles, intermediates, etc.; we noted the same generalization. Thus, if responses could be collected for each of the levels of explanation sophistication, and across a maximally varied set of specific reactions, suitable predictive models using machine learning classification techniques could be developed. In other words, a predictive model could be developed that classifies the level of explanation sophistication for electrophiles, for example, regardless of the reaction, assuming the reaction includes an electrophile (a sigmatropic rearrangement, for example, would not involve an electrophile in its concerted reaction mechanism).
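A classifier of this kind can be prototyped with standard text-classification tooling. The sketch below is a toy illustration only: the four training responses and their labels are invented placeholders, and a real model would be trained on a large, human-coded corpus of responses spanning many reactions, as described above.

```python
# Toy sketch of a bag-of-words classifier that assigns a level of
# explanation sophistication to a free-text response. The training
# examples are invented; real models require a large coded corpus.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

responses = [
    "I don't know",
    "the methoxide attacks the carbocation",
    "iodide leaves because it is a good leaving group",
    "iodide is a good leaving group because it is polarizable "
    "and can stabilize the negative charge",
]
levels = ["absent", "descriptive", "foundational", "complex"]

# unigram + bigram features feeding a multinomial logistic regression
model = make_pipeline(
    CountVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(responses, levels)

print(model.predict(["I have not studied this yet"]))
```

Separate models could be trained per reaction aspect (electrophiles, leaving groups, etc.), matching the step-level granularity proposed above.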

18.6  Implications for Educators

The primary focus of this work is on informing instruction. How educators can use this work is central to our purpose. We suggest three key implications for educators. First, the process of operationalizing the framework for different aspects of reaction mechanisms provides generalizable tools for assessing understanding of reaction mechanisms that may not have been assessed prior in their courses. In addition, we expect that educators will

Table 18.1  Levels of explanation sophistication for electrophiles. Example 1 is a hydrolysis of an acid chloride; Example 2 is a reaction of a Grignard reagent.

Absent
Description of level: No response; non-normative.
Example 1: This is a hydrolysis of an acid chloride.
Example 2: This is a reaction of Grignard reagent.

Descriptive
Description of level: Describes what is happening in the reaction; simplistic description of bond formation and breaking processes; does not address why the reaction is occurring.
Example 1: The methoxide is attacking the carbon in the middle and the electrons in the C=O bond are going to the oxygen as a lone pair. Then, that lone pair comes back down to reform the double bond and the chloride leaves to form the product.
Example 2: The C–Mg bond is attacking the carbon in carbon dioxide and forms a new bond between the two carbons. The electrons in the double bond becomes a lone pair on one of the oxygens. Then, the oxygen will grab the proton from HCl to form the product.

Foundational
Description of level: Describes why the reaction is occurring at a surface level; explicit features are mentioned; implicit features may be mentioned but not fully explained; example reasons: stability and leaving group ability.
Example 1: Methoxide is a good nucleophile. The methoxide will attack the carbonyl group because it is a good electrophile. The intermediate is negatively charged now and needs to become neutral and more stable, so the lone pair reforms the double bond and kicks off the chloride, which is a good leaving group to form the product.
Example 2: Grignard reagents are good nucleophiles and will do a nucleophile attack on the electrophile, carbon dioxide. This is because Grignard reagents are very unstable. Next, the lone pair on the negatively charged oxygen will grab the proton in an acid workup step and form the product.

Complex
Description of level: Describes why the reaction is occurring at a deeper level; explicit features are used to infer implicit features that are sufficiently explained; electronic effects are described; example reasons: electron density, electronegativity, and partial charges.
Example 1: Methoxide is a good nucleophile because it has a negative charge (i.e., excess electron density). The polarized C=O makes the carbon in the carbonyl group very electrophilic. The negative charge on the methoxide is attracted to this partial positive charge and will undergo a nucleophilic attack to form the tetrahedral intermediate. Reformation of the C=O is a strong driving force (lower in energy), and the chloride will leave because it is a good leaving group since it is polarizable and can better hold a negative charge. This will lead to the formation of the product.
Example 2: Grignard reagents can be thought either as ionic with a carbanion and magnesium cation or as covalent with a partially negative carbon and partially positive magnesium. Either way, the carbon in the Grignard reagent is very electronegative. Carbon dioxide has a carbonyl which means that the bond is polarized because of the more electronegative oxygen giving carbon a partial positive and oxygen a partial negative. The electronegative carbon in the Grignard reagent is attracted to the electropositive carbon in carbon dioxide and will undergo a nucleophilic attack. In the acid workup, the lone pair on the oxygen anion will attack the partial positive proton in HCl to form the product.


be willing to report their development of such rubrics so that they can be adapted and adopted for use by other educators. Thus, a potential repository of such rubrics is envisioned; for example, such rubrics could be posted on OrganiCERS (www.OrganiCERS.org), an online repository of assessment items, classroom activities, and other educational resources freely available to higher education faculty members teaching organic chemistry. Second, use of an overarching framework for evaluating reaction mechanism understanding within an organic chemistry course provides a new means for conducting formative and summative evaluation. We know that assessment drives learning.7,26,64 In line with best practices for using rubrics to evaluate learning,65 sharing these rubrics with learners for use in their studies has the potential to further impact learning; learners will have examples of what is expected. Additionally, as our research group continues to develop new models and means for disseminating those models, such as APIs (application programming interfaces) that immediately score responses, we expect new opportunities will emerge to measure reaction mechanism understanding through written assessment items in the context of a classroom experience.15 Third, given the overarching focus of promoting understanding of reaction mechanisms in the organic chemistry curriculum, assessment data collected using such rubrics provide a point of reflection when considering how instruction and curricula can be improved and transformed.

18.7  Implications for Researchers

Similarly, implications for researchers are three-fold. First, as has hopefully been apparent in this chapter, our research group has interests in further operationalizing the levels of explanation sophistication framework for use in developing more predictive models. Not only are these tools needed, but examples of how to use the assessment tools, and the value of those tools, need to be shared, evaluated, and continuously evolved. Second, through our Lewis acid–base model development work, we recognize that some types of algorithms need chemistry-specific text analysis dictionaries, including stop words and synonyms/replacements. While our dictionaries are already publicly available,46 we hope that we, and others, will continue to develop such resources for the chemistry education community. There is a real opportunity for the development of open education resources using machine learning technology. Third, there is a need to evaluate the innovative, research-based organic chemistry curricula that have emerged in the last decade. Often the tools developed to measure effectiveness are created by those who are developing and disseminating the new curricula;1,20,25,33,58,66–71 this is important! At the same time, having research-based and research-informed assessment tools that emerge outside those curricula provides a different means for demonstrating effectiveness.31
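Chemistry-specific preprocessing of the kind described above (stop words plus synonym/replacement dictionaries) can be sketched as follows. The dictionaries here are tiny invented examples for illustration, not the published resources, which are distributed as R files:

```python
# Illustrative text preprocessing with chemistry-specific dictionaries:
# synonym variants are mapped onto a canonical token before tokenizing,
# then stop words are removed. The entries below are invented examples.
import re

SYNONYMS = {
    "carbocation": ["carbo-cation", "carbo cation"],
    "nucleophile": ["nucleophilic species"],
}
STOP_WORDS = {"the", "a", "an", "is", "this", "of"}

def preprocess(text: str) -> list[str]:
    text = text.lower()
    # replace multi-word/variant spellings with a canonical token
    for canonical, variants in SYNONYMS.items():
        for variant in variants:
            text = re.sub(rf"\b{re.escape(variant)}\b", canonical, text)
    tokens = re.findall(r"[a-z]+", text)
    return [t for t in tokens if t not in STOP_WORDS]

print(preprocess("The nucleophilic species attacks this carbo-cation"))
```

Normalizing vocabulary in this way lets a downstream classifier treat “carbo-cation” and “carbocation” as the same feature, which matters when learner responses vary widely in terminology.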


18.8  A Path toward Better Learning

The levels of explanation sophistication framework offered in this chapter provides a new means to consider assessment of reaction mechanism understanding. While the research literature paints a bleak picture of the meaning learners ascribe to reaction mechanisms,2,4,24 the research literature on how such meaning and understanding is developed is promising: namely, through learning experiences that mirror the practice of chemistry.6,12,22 Thus, to learn reaction mechanisms, a learner must engage in describing what is happening in a reaction mechanism and why, rather than just drawing a mechanistic picture.1,6,26,33 Learners and educators need to consider instruction and assessment from the same overarching, broader understanding of reaction mechanisms that practicing chemists use.7–9,11,12 This includes, for example, assessing understanding of electrophiles in the context of a series of reaction mechanisms rather than the specifics of a particular, single reaction. Machine learning algorithms and models then provide the means by which automated text analysis tools can be used to develop predictive models for use throughout the classroom and broader assessment contexts.15,17 Such models provide the assessment tools to effectively evaluate understanding of reaction mechanisms across the organic chemistry curriculum through explanations. Drawing mechanistic pictures continues to be important to the organic chemistry educational experience.5,6 However, written explanations of those mechanistic pictures provide a better measure of learning and thus should have an equally important place in organic chemistry curricula.12,14,26

Acknowledgements

We would like to thank the countless students who have provided explanations in the context of their organic chemistry courses at the University of South Florida (Institutional Review Board Application Pro#00028802); this work would not have been possible without their willingness to participate in the innovative learning experiences we provided them. Additionally, we would like to thank Daniel Cruz-Ramírez de Arellano, Frankie Costanza, and Kimberly B. Fields (University of South Florida) for providing access to their courses; the feedback provided by these educators continues to challenge and inform the future work outlined in this chapter.

References

1. O. M. Crandell, M. A. Lockhart and M. M. Cooper, J. Chem. Educ., 2020, 97, 313.
2. R. Ferguson and G. M. Bodner, Chem. Educ. Res. Pract., 2008, 9, 102.
3. D. Cruz-Ramírez de Arellano and M. H. Towns, Chem. Educ. Res. Pract., 2014, 15, 501.
4. G. Bhattacharyya and M. S. Harris, J. Chem. Educ., 2018, 95, 366.
5. N. P. Grove, M. M. Cooper and E. L. Cox, J. Chem. Educ., 2012, 89, 850.
6. G. Bhattacharyya and G. M. Bodner, J. Chem. Educ., 2005, 82, 1402.
7. N. Graulich and M. Schween, J. Chem. Educ., 2018, 95, 376.
8. W. M. Goodwin, Found. Chem., 2008, 10, 117.
9. W. Goodwin, Ann. N. Y. Acad. Sci., 2003, 988, 141.
10. G. Bhattacharyya, J. Chem. Educ., 2013, 90, 1282.
11. J. R. Raker and M. H. Towns, Chem. Educ. Res. Pract., 2012, 13, 277.
12. N. E. Bodé, J. M. Deng and A. B. Flynn, J. Chem. Educ., 2019, 96, 1068.
13. F. M. Watts, I. Zaimi, D. Kranz, N. Graulich and G. V. Shultz, Chem. Educ. Res. Pract., 2021, 22, 364.
14. T. Gupte, F. M. Watts, J. A. Schmidt-McCormack, I. Zaimi, A. R. Gere and G. V. Shultz, Chem. Educ. Res. Pract., 2021, 22, 396.
15. A. J. Dood, K. B. Fields, D. Cruz-Ramírez de Arellano and J. R. Raker, Can. J. Chem., 2019, 97, 711.
16. B. J. Yik, A. J. Dood, D. Cruz-Ramírez de Arellano, K. B. Fields and J. R. Raker, Chem. Educ. Res. Pract., 2021, 22, 866.
17. A. J. Dood, J. C. Dood, D. Cruz-Ramírez de Arellano, K. B. Fields and J. R. Raker, Chem. Educ. Res. Pract., 2020, 21, 267.
18. A. J. Dood, K. B. Fields and J. R. Raker, J. Chem. Educ., 2018, 95, 1267.
19. A. J. Dood, J. C. Dood, D. Cruz-Ramírez de Arellano, K. B. Fields and J. R. Raker, J. Chem. Educ., 2020, 97, 3551.
20. A. B. Flynn and R. B. Featherstone, Chem. Educ. Res. Pract., 2017, 18, 64.
21. I. Caspari, D. Kranz and N. Graulich, Chem. Educ. Res. Pract., 2018, 19, 1117.
22. V. Talanquer, Chem. Educ. Res. Pract., 2018, 19, 998.
23. I. Caspari, M. L. Weinrich, H. Sevian and N. Graulich, Chem. Educ. Res. Pract., 2018, 19, 42.
24. N. P. Grove, M. M. Cooper and K. M. Rush, J. Chem. Educ., 2012, 89, 844.
25. M. M. Cooper, R. L. Stowe, O. M. Crandell and M. W. Klymkowsky, J. Chem. Educ., 2019, 96, 1858.
26. R. L. Stowe and M. M. Cooper, J. Chem. Educ., 2017, 94, 1852.
27. G. Bhattacharyya, Chem. Educ. Res. Pract., 2008, 9, 84.
28. M. M. Cooper, J. Chem. Educ., 2013, 90, 679.
29. J. R. Raker and M. H. Towns, Chem. Educ. Res. Pract., 2012, 13, 179.
30. M. C. Connor, B. H. Glass and G. V. Shultz, J. Chem. Educ., 2021, 98, 2786.
31. M. M. Cooper and R. L. Stowe, Chem. Rev., 2018, 118, 6053.
32. L. Lieber and N. Graulich, Chem. Educ. Res. Pract., 2022, 23, 38.
33. M. M. Cooper, H. Kouyoumdjian and S. M. Underwood, J. Chem. Educ., 2016, 93, 1703.
34. M. M. Cooper, J. Chem. Educ., 2020, 97, 903.
35. A. B. Flynn, J. Chem. Educ., 2011, 88, 1496.
36. K. Noyes, R. L. McKay, M. Neumann, K. C. Haudek and M. M. Cooper, J. Chem. Educ., 2020, 97, 3923.
37. X. Zhai, Y. Yin, J. W. Pellegrino, K. C. Haudek and L. Shi, Stud. Sci. Educ., 2020, 56, 111.
38. M. Shiroda, J. D. Uhl, M. Urban-Lurain and K. C. Haudek, J. Sci. Educ. Technol., 2022, 31, 117.
39. M. Urban-Lurain, R. A. Moscarella, K. C. Haudek, E. Giese, D. F. Sibley and J. E. Merrill, 39th IEEE Frontiers in Education Conference, San Antonio, TX, 2009.
40. K. C. Haudek, J. J. Kaplan, J. Knight, T. Long, J. Merrill, A. Munn, R. Nehm, M. Smith and M. Urban-Lurain, CBE Life Sci. Educ., 2011, 10, 149.
41. X. Zhai, L. Shi and R. H. Nehm, J. Sci. Educ. Technol., 2021, 30, 361.
42. B. A. Winograd, A. J. Dood, R. Moeller, A. Moon, A. Gere and G. Shultz, in LAK21: 11th International Learning Analytics and Knowledge Conference, ACM, Irvine, CA, USA, 2021, pp. 586–591.
43. J. A. Schmidt-McCormack, J. A. Judge, K. Spahr, E. Yang, R. Pugh, A. Karlin, A. Sattar, B. C. Thompson, A. R. Gere and G. V. Shultz, Chem. Educ. Res. Pract., 2019, 20, 383.
44. T. Kwartler, Text Mining in Practice with R, Wiley, Hoboken, NJ, 1st edn, 2017.
45. I. Feinerer, K. Hornik and D. Meyer, J. Stat. Software, 2008, 25, 1, DOI: 10.17605/OSF.IO/TNBEV.
46. B. J. Yik and J. R. Raker, Lewis Acid–Base - R Files for Instructors and Researchers, 2021.
47. M. Lintean, V. Rus and R. Azevedo, Int. J. Artif. Intell. Educ., 2012, 21, 169.
48. K. Ramasubramanian and A. Singh, Machine Learning Using R, Apress, Berkeley, CA, 2019.
49. O. Sagi and L. Rokach, WIREs Data Mining Knowl. Discovery, 2018, 8, e1249.
50. J. Cohen, Educ. Psychol. Meas., 1960, 20, 37.
51. M. L. McHugh, Biochem. Med., 2012, 22, 276.
52. B. W. Matthews, Biochim. Biophys. Acta, Proteins Proteomics, 1975, 405, 442.
53. P. Baldi, S. Brunak, Y. Chauvin, C. A. F. Andersen and H. Nielsen, Bioinformatics, 2000, 16, 412.
54. D. Chicco and G. Jurman, BMC Genomics, 2020, 21, 6.
55. K. R. Galloway, M. W. Leung and A. B. Flynn, J. Chem. Educ., 2018, 95, 355.
56. D. Klein, Organic Chemistry, Wiley, 4th edn, 2021.
57. J. M. Deng and A. B. Flynn, Chem. Educ. Res. Pract., 2021, 22, 749.
58. O. M. Crandell, H. Kouyoumdjian, S. M. Underwood and M. M. Cooper, J. Chem. Educ., 2019, 96, 213.
59. N. Graulich and I. Caspari, Chem. Teach. Int., 2021, 3, 19.
60. N. Graulich, S. Hedtrich and R. Harzenetter, Chem. Educ. Res. Pract., 2019, 20, 924.
61. B. Bell and B. Cowie, Sci. Educ., 2001, 85, 536.
62. M. E. Anzovino and S. Lowery Bretz, Chem. Educ. Res. Pract., 2015, 16, 797.
63. M. E. Anzovino and S. L. Bretz, Chem. Educ. Res. Pract., 2016, 17, 1019.
64. T. Holme, S. L. Bretz, M. Cooper, J. Lewis, P. Paek, N. Pienta, A. Stacy, R. Stevens and M. Towns, Chem. Educ. Res. Pract., 2010, 11, 92.
65. S. M. Brookhart, How to Create and Use Rubrics for Formative Assessment and Grading, Association for Supervision & Curriculum Development, 2013.
66. D. M. Webber and A. B. Flynn, J. Chem. Educ., 2018, 95, 1451.
67. A. B. Flynn and W. W. Ogilvie, J. Chem. Educ., 2015, 92, 803.
68. S. K. Houchlei, R. R. Bloch and M. M. Cooper, J. Chem. Educ., 2021, 98, 2751.
69. A. R. Straumanis and S. M. Ruder, J. Chem. Educ., 2009, 86, 1389.
70. M. A. Lipton, J. Chem. Educ., 2020, 97, 960.
71. K. R. Galloway, C. Stoyanovich and A. B. Flynn, Chem. Educ. Res. Pract., 2017, 18, 353.

Chapter 19

The Central Importance of Assessing “Doing Science” to Research and Instruction†

Cara E. Schwarz,a Kimberly S. DeGlopper,a Aubrey J. Ellison,a Brian J. Esselmana and Ryan L. Stowe*a

Department of Chemistry, University of Wisconsin, Madison, 1101 University Avenue, Madison, WI 53706, USA *E-mail: [email protected]

19.1  Introduction

Assessment, that is, the process of inferring what students know and can do from what they create, perform, write, or say, is central to science learning environments and to the study of those environments. Instructors cannot know exactly what students are thinking when they interact with one another or with written tasks. As such, we must interpret evidence elicited by assessment tasks or instructional interactions and subsequently use this evidence to construct arguments related to learning.1 These arguments can, in turn, be used to support alterations to the learning environment in response to student thinking, provide feedback to learners about their understanding, or evaluate learners relative to some external standard. Additionally, in science education and chemistry education scholarship, responses to assessments

† Electronic supplementary information (ESI) available: Table showing codes capable of describing our dataset and example student responses. See DOI: 10.1039/9781839167782

  Advances in Chemistry Education Series No. 10 Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices Edited by Nicole Graulich and Ginger Shultz © The Royal Society of Chemistry 2023 Published by the Royal Society of Chemistry, www.rsc.org


may be used to evaluate the efficacy of an intervention or transformation, or to show that an alteration to a learning environment was not detrimental. Furthermore, assessments define “success” to students, instructors, and researchers and thus play a powerful role in signaling course (or study) priorities. There is ample evidence that students engage with and respond to messages sent from assessments about the sort of intellectual work that is valued.2 If instructors say a class is about “learning how to learn” and becoming a scientifically literate citizen3 and then proceed to mainly assess decontextualized skills and recall, the course will be about rote skill performance and memorization in the eyes of many students. Accordingly, it is vitally important that instructors and researchers carefully consider the sort of intellectual work that is worth emphasizing and rewarding and create learning environments that support this work. Here, we focus our attention on written tasks incorporated in assessments that can be used to infer engagement in valued intellectual work. We note that these are one part (albeit an important part) of a curricular ecosystem that emphasizes students applying models to rationalize chemical phenomena, and coherent alignment of multiple curricular components will be required to support successful engagement with the sorts of tasks we describe in this chapter.4

19.2  Assessment 101

We subscribe to the consensus view, as espoused by a committee of cognitive scientists, sociologists, and assessment experts assembled by the National Academies of Sciences in Knowing What Students Know,1 that assessment is a process of reasoning from evidence. Assessments are used to collect data in the form of students’ responses, from which we make inferences about student learning. A theory of cognition guides the design of assessment items and informs the sorts of inferences that may be reasonably drawn from student response data. Knowing What Students Know represents “assessment as argument from evidence” using three interconnected corners of an “assessment triangle.” The consensus study authors argue that assessment designers should take into account: “(1) a model of student cognition and learning in the domain, (2) the kinds of observations that will provide evidence of students’ competencies, and (3) an interpretation process for making sense of the evidence.”1 In the sections that follow, we will unpack each corner of the assessment triangle, as well as how each construct relates to the other two, in more detail and provide illustrative examples.

19.2.1  Observation

The nature and value of each observation is determined by the types of responses that might be elicited by assessment items. Depending on the prompt or assessment design, an observation might be a multiple-choice selection, a hand-drawn diagram, an essay, or some combination of the


Figure 19.1  A common introductory organic chemistry learning objective and two possible assessment prompts.5

three. Assessment tasks should be designed to elicit sufficient observations for evaluating student progress toward defined learning objectives. For example, consider the prospect of assessing the learning goal shown in Figure 19.1. Commonly encountered prompts, such as Prompt A, in which students are asked to predict the major product of a reaction system, do not have the potential to elicit evidence of progress toward this objective. An appropriate answer to such a prompt contains only a claimed major product—no explanation for why that product might be preferred over other alternatives is asked for. In contrast, a prompt like Prompt B, which asks for a short explanation as to why the predicted major product is formed, has the potential to provide considerably more evidence regarding why the student predicted a certain regiochemical outcome.5 When designing assessments, whether in the context of research or instruction, it is important to construct assessment items that can elicit useful observations.6

19.2.2  Interpretation

Because student learning cannot be measured directly, it must be inferred from students’ responses to assessments (i.e., observations). To do this, several assumptions must be made, not all of which are equally reasonable. Returning to the alkene hydrobromination example (see Figure 19.1), it would be unreasonable to infer that a student understands why the major product has a bromine at the benzylic position simply because they drew the correct product distribution or communicated the correct electron-pushing mechanism. Many studies have demonstrated that students are capable of predicting products and drawing mechanisms without understanding how and why product formation is reasonable.7–10 However, a reasonable mechanism accompanied by an appropriate mechanistic explanation8,11,12 (e.g., the intermediate leading to Product A is more stable than the intermediate leading to Product B due to increased charge delocalization via π conjugation)


may be reasonably interpreted as evidence that students understand how and why the focal phenomenon happened. Accordingly, prompts which ask for coupled mechanisms and explanations enable instructors and researchers to more confidently describe how (or whether) students connect molecular behavior to an observable event in the context of an assessment. These coupled prompts also allow instructors to separately award credit for the mechanism and the explanation, signaling to students the value of each portion of the answer. One must be mindful of the extent to which inferences about student thinking are supported by the observations elicited.

19.2.3 Conceptual Change

A mechanistic model of how learning occurs, represented by a theory of cognition, should provide insight into the process of learning and inform both education research and practice. A theory of cognition describes how knowledge is "represented, organized, and processed in the mind."1 Before theories of cognition were developed, education research relied on behaviorism,13 which focused only on observable actions and treated cognition as a black box. Regarding research, a theory of cognition should affect how a study is designed, how data are interpreted, and what conclusions may reasonably be drawn. For instruction, it should affect the design of curricula and learning materials, the emphasis of day-to-day student and instructor activities, and how assessments are structured and interpreted. In essence, one's theory of cognition, whether implicit or explicit, determines how one frames the assessment of learning.

Here, we will describe two "classes" of learning models, which represent two ends of a continuum. This continuum is defined by assumptions about the stability of learners' knowledge. On one end is a "theory–theory" model, in which knowledge structures are assumed to be stable across time and place. On the other end is a "knowledge-in-pieces" model, in which knowledge is assumed to represent in-the-moment connections between small-grain ideas.14 These theories are domain-independent models; more detailed models of cognition may exist for specific topics and skills. Fundamental assumptions about learning should be made explicit in studies of chemistry learning, as they inform assessment design and interpretation.15,16

A theory–theory model of cognition describes knowledge as existing in coherent, "theory-like" structures in the minds of learners.17 These structures are assumed to be stable across contexts. Early theory–theory models suppose that learning can proceed via one of two avenues—assimilation or accommodation.
New information that is consistent with prior knowledge can be easily assimilated into existing knowledge structures. However, if the new information is contradictory or inconsistent with prior knowledge, a person must undergo accommodation. Strike and Posner18 stated that for accommodation, the person must first experience some dissatisfaction caused by the incongruence between the new concept and their existing “theory”. The sense of dissatisfaction generated by the conflict between prior

324

Chapter 19

knowledge and new information motivates them to abandon their "wrong theory". The new concept will be taken up as a new "theory-like" structure, but only if it is intelligible, seems plausible, and has the potential to expand their knowledge. Put simply, it must make more sense than previous conceptions. Note that Strike and Posner present conceptual change as a fundamentally rational process that, in many ways, parallels paradigm shifts observed in science.17,19

On the other end of the stability spectrum is the knowledge-in-pieces model. According to this model, rather than existing in large-grain, coherent structures, knowledge exists in elements of varying grain sizes that are activated and connected in the moment as needed. The smallest of these elements, termed by diSessa "phenomenological primitives" or "p-prims" for short, are intuitive, context-independent, and derived from everyday experience.20,21 For example, the p-prim "more means more" can be used to explain why pushing a heavier object requires more effort. Hammer and Elby22 use the more general term "resource", which encompasses knowledge elements of varying grain size and type. Importantly, a given resource is never inherently "correct" or "incorrect"—whether a resource is useful depends on the context in which it is activated. For example, "more means more" is handy when moving a couch but a bit less useful when activated as part of the claim "the substance composed of heavier molecules will always have the higher boiling point." Viewed through a knowledge-in-pieces model of cognition, learning involves acquiring resources and recognizing the situations for which they are useful. It should be noted that a "knowledge-in-pieces" view does not preclude stability.
Resources which are regularly activated and connected may, over time, be compiled into a single resource that can be called on with little effort.23 The key differentiator between “theory–theory” and “knowledge-in-pieces” models of cognition is that a “knowledge-in-pieces” view does not assume stability in the absence of evidence.

19.2.4 How Observation, Interpretation, and Cognition Work Together

As stated previously, each vertex of the assessment triangle interacts strongly with the other two vertices. A theory of cognition influences the types of observations useful for inferring learning and the assumptions underlying interpretation. Observations provide the basis for interpretation and conclusions regarding what students know and can do. Interpretation is the process by which students' writings or drawings are seen to represent snapshots of learning in each moment.

To illustrate how observation, interpretation, and cognition interact, consider the assessment of students' understanding of how and why boiling occurs reported by Cooper et al.24 In this interview study, students were presented with three pairs of molecules (ethane/ethanol, methanol/ethanol, and dimethyl ether/ethanol) and asked to predict and explain which substance in each pair had the higher boiling point. Each pair of molecules was meant to elicit a distinct set of inferences from students.


That is, some of the ideas required to explain why ethanol boils at a higher temperature than methanol differ from those required to explain the boiling point difference between dimethyl ether and ethanol. Cooper and colleagues observed significant flexibility and variability in student thinking across the contexts presented in their interview. For example, when comparing ethane and ethanol, one student initially stated that ethanol would have a lower boiling point than ethane because ethanol has an oxygen atom. The interviewer asked the student to draw the Lewis structure of ethanol, and the student revised their prediction and explanation to include hydrogen bonding.

From a theory–theory perspective, this change in reasoning, cued by a prompt to draw a Lewis structure, cannot be readily explained, and it is unclear how to interpret the student's two responses. From a knowledge-in-pieces perspective, however, it seems that they initially did not activate resources related to hydrogen bonding, but a cue from the environment (i.e., the encouragement to draw a Lewis structure and the likelihood that it is relevant) prompted activation of additional resources that coalesced to form a new explanation. A reasonable interpretation would be that this student possesses productive ideas for reasoning about boiling points, but that they need more practice in readily identifying contexts for which these ideas are appropriate.

We advocate for a resources perspective on cognition as it is more consistent with available evidence and allows us to take an asset view of students and their prior knowledge and experiences.22,24–28 With a resources view of cognition, we can approach instruction with the goal of helping students refine the ways in which they connect knowledge elements to make sense of phenomena rather than trying to replace "misconceptions" with "correct" ideas.
Students have many useful ideas that can be applied to help them understand how and why molecular behavior leads to things they can observe. However, because molecular behavior is often counterintuitive, students need time and practice to get a sense of which ideas are productive in which contexts. This is not as straightforward as providing students the "correct" answer after they have constructed an answer that does not align with canon, as the theory–theory view may suggest.

19.3 Assessing Work Aligned with the Practice of Chemistry

We argue that chemistry coursework should emphasize and reward (i.e., assess) intellectual work aligned with the practice of chemistry.6,29,30 "Doing chemistry" entails constructing, critiquing, refining, and communicating causal accounts for how the world works and designing solutions to pressing problems.14,31 Due to the invisible actors that underlie phenomena of interest to chemists, construction of these causal accounts very often involves simplifying the system under study and attending to how and why ensembles of molecular entities connect to an observable event.11,12 Connecting how and why interactions between ensembles of molecules result


in chemical phenomena requires weaving together a variety of large-grain ideas (e.g., energy, bonding interactions, Coulombic interactions)32 to articulate the properties of molecular entities in a system and the interactions these entities are likely to engage in. The National Academies' Framework for K-12 Science Education calls this sort of performance "three-dimensional" (or 3D), as it requires using knowledge of fundamental ideas in a practice characteristic of science as framed by a crosscutting lens.14 Although the Framework was developed for K-12 science education, the literature that underpins it is equally applicable to higher education settings.33 Assessments which have the potential to elicit evidence of students making sense of a phenomenon in terms of large-grain ideas are often referred to as "3D assessments" by the Academies and by scholarship related to 3D learning.34

Several scholarly works have described the characteristics of 3D tasks that might be given in introductory and organic chemistry.6,29,30,35 3D tasks must be grounded in an observable event or perplexing phenomenon, which gives purpose to students' explanations, models, and arguments. Skills associated with "chemistry class" become decontextualized and artificial if not anchored to an occurrence in the natural world. No practicing scientist draws random Newman projections for the sake of demonstrating proficiency in their artistic ability. Rather, Newman projections are authentically used by practicing organic chemists to help frame energetic arguments where a particular conformation may influence an observable outcome. Skills in depicting molecules only have utility if they are used to describe some aspect of a physical system so that the system may be understood or manipulated. 3D tasks also must have the potential to elicit evidence that students can connect "big ideas" to phenomena via engagement in science practices (e.g., developing and using models, arguing from evidence).
Other chapters in this volume are focused on unpacking particular practices (e.g., mechanistic explanations).

To characterize the potential of a given task to elicit evidence of engagement in a 3D performance, Laverty and colleagues created an instrument called the "3D Learning Assessment Protocol", or 3D-LAP.36 This instrument provides criteria an item must meet to have the possibility of eliciting evidence of engagement in a science practice, fundamental disciplinary idea, or crosscutting concept. An example of the criteria for the science practice "Developing and Using Models" is shown in Table 19.1. Importantly, items which are "3D" according to the 3D-LAP have the potential to elicit evidence of engagement in a 3D performance. To determine whether a given item meets this potential, however, further calibration of the prompt structure is required, which could be done through cognitive interviews. The 3D-LAP has been used by us and others to describe how assessments changed as the result of course transformations in introductory chemistry, biology, and physics;37 suggest modifications to "traditional" tasks;35 and characterize how different types of reformed chemistry learning environments operationalize "success".30

An example item that has the potential to elicit evidence of student engagement in using models can be found in Figure 19.2. This item asks students to


Table 19.1 The 3D-LAP criteria for an assessment item to have the potential to elicit evidence of engagement in the scientific practice of developing and using models.36

Developing and using models—3D-LAP criteria:
- Question gives an event, observation, or phenomenon for the student to explain or make a prediction about
- Question gives a representation or asks student to construct a representation
- Question asks student to explain or make a prediction about the event, observation, or phenomenon
- Question asks student to provide the reasoning that links the representation to their explanation or prediction

predict and explain whether nylon or polyglycolic acid is used to make surgical thread that is readily absorbed into the body. Sutures dissolving (or not) is clearly an observable phenomenon for anyone who has surgery, so the first criterion for an item to be considered "3D" is met (i.e., the item is centered on a phenomenon). Students are asked to represent the relative activation energy of tetrahedral intermediate formation in polyamide and polyester systems and subsequently use this representation to predict which substance they would expect to degrade via hydrolysis more easily in the body. Students finally justify their prediction by connecting the extent to which surgical thread is "broken down" to the energy barrier of the hydrolysis reaction. The prompt in Figure 19.2 meets the 3D-LAP criteria for "developing and using models",36 requires explicit invocation of big ideas such as "energy" and "bonding interactions",32 and asks for consideration of how structure relates to properties (the crosscutting concept "structure and function").

As noted previously, what instructors emphasize and reward on class assessments sends strong implicit and explicit messages to students about what type of learning has value. Homework, quizzes, and exams (if present in a course) should all place substantial emphasis on 3D tasks such as that shown in Figure 19.2.
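The all-or-nothing logic of the Table 19.1 criteria can be sketched as a simple checklist. This sketch is purely illustrative and is not part of the 3D-LAP itself, which is a protocol applied by human raters; the field names below are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical encoding of the four "Developing and Using Models" criteria.
@dataclass
class AssessmentItem:
    gives_phenomenon: bool          # event/observation/phenomenon provided
    involves_representation: bool   # representation given or requested
    asks_prediction: bool           # explanation or prediction requested
    asks_linking_reasoning: bool    # reasoning linking representation to prediction

def has_3d_potential(item: AssessmentItem) -> bool:
    # An item must meet ALL four criteria to have the potential
    # to elicit evidence of developing and using models.
    return all([
        item.gives_phenomenon,
        item.involves_representation,
        item.asks_prediction,
        item.asks_linking_reasoning,
    ])

# A bare "predict the major product" prompt (like Prompt A) asks
# only for a prediction, so it falls short of the full set:
prompt_a = AssessmentItem(False, False, True, False)
print(has_3d_potential(prompt_a))  # False
```

The point the sketch makes is that meeting a subset of criteria (e.g., part A of Figure 19.2 alone) is not sufficient; the criteria function conjunctively.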
While we do not have evidence as to how the relative emphasis of 3D performances relates to the messages students receive about valued knowledge processes and products, we are comfortable stating the following: if one can get an A in a course without constructing causal accounts for chemical phenomena, there is not enough emphasis on 3D tasks in that course.30 Transformed introductory and organic chemistry courses informed by literature on 3D learning have historically allotted 30–50% of points on exams to items that are described as "3D" by application of the 3D-LAP.29,30,37 We are of the opinion that, if time and energy limitations did not exist, one could not place too much emphasis on 3D performances. Skill-based tasks (e.g., drawing chair conformations or PE surfaces) can be nested within 3D tasks so that the skills have a clear purpose and scaffold students' written explanations in an authentic manner. Accordingly, our organic chemistry instructional team aims to create quizzes and exams that overwhelmingly emphasize predicting, explaining, and modeling phenomena (i.e., >70% of total points). The common objection that "students cannot handle too


Figure 19.2 A multi-part prompt with the potential to engage students in developing and using models to explain observable properties. Portions of the prompt that meet the 3D-LAP criteria for developing and using models are highlighted. Prompt spacing has been condensed from what was given to students.


many explanation- or model-focused tasks" is not supported by any evidence. When ample course time and resources are available, even very young learners can productively engage in sophisticated sensemaking work.38

19.4 3D Assessments as Research Tools

3D tasks have the potential to elicit nuanced evidence about what students know and can do when asked to make sense of phenomena. As such, they should be viewed as potentially powerful research tools useful for assessing the efficacy of learning environment reforms. Indeed, there are several published reports in which claims about learning were supported, in whole or in part, by student responses to 3D items.39–44 In these studies, student responses to open-ended, 3D tasks were characterized using codes that described the ideas called to mind and connected in the context of the prompt. Importantly, virtually all who use 3D tasks as research instruments (including us) subscribe to a knowledge-in-pieces view of cognition. Thus, responses to a given item cannot support inferences about a durable understanding or misunderstanding.

We present some recent work from our group5 as one example of how responses to a multi-part 3D task might be used to infer the characteristics of learning environments that support students in explaining phenomena. This study occurred in the context of a second-semester organic chemistry course taught during the spring of 2021. Course sections under study used the same textbook, followed approximately the same order of topics, and had similar course structures. The most substantive difference between course sections was the extent to which they emphasized 3D tasks on assessments—one course placed substantial emphasis on such tasks (∼50% or more of points on exams) while the others placed markedly less emphasis on students explaining how and why phenomena happen. Students in all sections were given a prompt that asked them to evaluate a claim as to the outcome of addition of HBr to a benzylic alkene (see Figure 19.1, Prompt B).
Our team examined responses to this prompt to determine (1) what ideas related to structure and energy were activated, and (2) whether enrolling in a learning environment with a substantial emphasis on 3D tasks was associated with more productive connection of structure and energy ideas in prompt responses. To parse more- or less-productive ideas and connections, we attended to whether students invoked the structure of reactants, products, and/or intermediates and whether structural accounts were connected to claims related to energy (e.g., the benzylic carbocation is lower in energy due to charge delocalization via π conjugation). A table showing codes capable of describing our dataset and example student responses may be found in the Electronic Supplementary Information.

The codes that describe student responses to 3D assessment tasks can be used as outcome variables in cross-sectional studies. Such an analytic strategy enables a far more nuanced description of student thinking than typical outcome measures such as DFW rates, course grades, and scores on


instructor-authored exams. For example, in our previously mentioned study, we were able to demonstrate that enrollment in a learning environment that substantially emphasized 3D tasks was associated with productively connecting the structure of relevant intermediates or transition states to their relative energies.5 More coarse-grained outcome measures (such as DFW rates and course grades) support claims about the percentage of students who succeeded in a course but cannot enable inferences regarding what "success" means. If exams emphasize solely skill- or algorithm-based questions, such as nomenclature or drawing structural representations (e.g., chairs, Newman projections) without any context or explanation, improvement in exam scores says nothing about how well students were supported in "doing science". Information about the sorts of tasks emphasized and rewarded on assessments is needed to contextualize reported exam score or course grade improvements.
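As a concrete illustration of codes serving as outcome variables, the tallying step of such an analysis might look like the following sketch. The section labels, code names, and counts here are invented for illustration; the actual codebook and data are described in the cited study and its Electronic Supplementary Information.

```python
from collections import Counter

# Invented example data: each student response has been assigned one code.
# "structure-energy" marks responses connecting the structure of an
# intermediate to a claim about its relative energy (a hypothetical label).
coded_responses = {
    "high 3D emphasis": ["structure-energy", "structure-energy",
                         "structure-only", "energy-only"],
    "low 3D emphasis":  ["structure-only", "structure-only",
                         "structure-energy", "no-account"],
}

def share_with_code(codes, target="structure-energy"):
    # Proportion of a section's responses assigned the target code,
    # usable as an outcome variable in a cross-sectional comparison.
    return Counter(codes)[target] / len(codes)

for section, codes in coded_responses.items():
    print(f"{section}: {share_with_code(codes):.0%} connected structure to energy")
```

Comparing such proportions across sections (with appropriate statistics) is what supports claims like the association reported above, in a way that a course grade distribution cannot.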

19.5 3D Assessments as a Vital Part of 3D Learning Environments

Despite 3D assessments having the potential to provide strong evidence of what students know and can do with their knowledge, their use in chemistry courses is very limited. Assessments in many courses feature problems that require students to perform skills or calculations without connecting the resultant drawing or number to chemical phenomena or asking students to explain how or why something is happening. Stowe and Cooper6 found that 93% of points on a sample of 15 organic chemistry exams from highly ranked universities were allotted to items that did not involve the use of scientific practices. 3D questions are also exceedingly rare on most standardized exams, such as the ACS exams, which are used broadly as measures of success in organic chemistry courses.6 As we noted previously, assessments which embed no 3D questions are not capable of eliciting evidence of student engagement in connecting molecular behavior to how the world works. Further, students enrolled in courses which never require construction of causal accounts for phenomena are unlikely to see "making sense of the world" as a valued knowledge construction process.

There are a handful of organic chemistry learning environments that have consistently integrated 3D tasks into their courses. Two examples are OCLUE (Organic Chemistry, Life, the Universe, and Everything)45 at Michigan State University and select sections of the introductory and intermediate organic chemistry courses at the University of Wisconsin-Madison.5,29 In each of these learning environments, students are provided with numerous opportunities to practice predicting and explaining chemical phenomena discussed in class through homework assignments and small group discussion activities. At UW, most problem sets and discussion activities contain at least one 3D task,29 in which students are asked to predict and/or explain a phenomenon.
Additionally, throughout the second semester course, there are numerous problems that are not 3D but still involve scientific practices such


as analyzing and interpreting data and arguing from evidence. The most common examples of these are spectroscopic analysis problems in which students must analyze multiple types of spectroscopic evidence (1H NMR, 13C NMR, EI-MS, and IR) to determine the structure of an unknown or a reaction outcome.46 Spectroscopic analysis is certainly an important skill in organic chemistry, but because solving spectra does not require the use of core ideas, such as energy or electrostatic and bonding interactions, questions of this type are not 3D. Ideally, argumentation from spectroscopic evidence would be coupled with prompts asking students to explain, for example, why the product distribution indicated by the evidence came about. Such multi-part prompts make explicit the reality that "doing science" involves an ensemble of science practices.

It should be acknowledged that writing (and grading) questions which ask students to link big ideas to why phenomena happen is a significant undertaking. There are several published resources that can assist instructors wishing to increase the extent to which their courses emphasize 3D learning. For example, the 3D-LAP36 can assist in revising existing items so that they have the potential to elicit evidence of students using big ideas in science practices.35 Recall that the 3D-LAP provides criteria that a task must meet in order for it to have the potential to elicit evidence of 3D learning through engagement in scientific practices, core ideas, and crosscutting concepts. Existing tasks may meet some, but not all, of the criteria given—for example, one could imagine giving a prompt that consists solely of part A shown in Figure 19.2. Such an item is contextualized by a phenomenon and asks for depiction of a pair of potential energy surfaces. However, without part B shown in the figure, the item would not have the potential to elicit any evidence that the students vested their representations with meaning.
The addition of a prompt asking students to unpack the reasoning underpinning their representation (e.g., mechanism, PE surface, transition state rendering) is a great way to create 3D tasks from existing items.

To support students in making sense of increasingly complex systems, 3D tasks should grow in complexity over the course of a semester or year of instruction. We illustrate how 3D tasks might increase in sophistication over time using a pair of tasks given at UW-Madison (see Figure 19.3). The first task is from the first exam in the first-semester organic chemistry course, while the second task is from the third exam in the second-semester course. Both questions require students to connect ideas about the structures of molecules to their relative energies. In the second example, however, students must consider a variety of factors to explain not only the experimentally observed equilibrium, but also why hypothetical rotation and locking of the C–N bond in place would lead to an equilibrium favoring the opposite side of the reaction.

In the first question, students are asked to draw Newman projections for conformers of butane and rationalize the relative energy difference between two eclipsed conformations (an example covered in class, in the textbook, and in practice problems). The next part of the prompt is about a molecule (1,2-difluoroethane) for which the students have not studied the conformational potential energy surface. To help students attend to the salient parts of


Figure 19.3 Two example 3D prompts focused on modeling and explanations using core ideas such as energy, structure–property relationships, electrostatic and bonding interactions, and change and stability. Prompt spacing has been condensed from what was given to students.


this new phenomenon, they are explicitly prompted to consider the expected steric interactions for 1,2-difluoroethane. The second question, given near the end of the second-semester course, provides less information to the students about what ideas they should incorporate into their explanations. They are asked to describe structural features that lead to one side of the equilibrium being favored in each case but are not told exactly what structural features to consider. The nitrogen lone pair orbital images are provided to help students visualize the electronic structure of the molecule, but the lone pair is just one factor that is relevant to the question. Students could reference a handful of others.

While full curricular transformations may not be feasible or desirable for all instructors wishing to incorporate more 3D learning in their courses, changing the emphasis of student work and assessments can have a positive impact on students' abilities to explain phenomena of interest. Certainly, it is not necessary or practical to try to jump from having assessments with no 3D questions to only having 3D questions. Nor is it desirable to place substantial 3D emphasis on assessments without substantial 3D emphasis on practice materials. Instructors can build in 3D learning opportunities and 3D assessments over time, thus creating a rich learning environment without dramatic changes between iterations of the course. Having one or two 3D questions on all assessments and practice materials would provide students an opportunity to do work more closely associated with the work of chemists and to show the ways that they can use their knowledge and understanding of atomic/molecular behavior to make sense of phenomena.

19.6 Future Directions for Research on 3D Assessments

While 3D assessments support students in learning the practices characteristic of science, they do not necessarily support students in adopting the underlying epistemologies that guide scientific practices.47 The primary reason that scientists develop models, analyze data, and construct explanations is to make sense of puzzling phenomena and generate knowledge. Students may at times be motivated by a similar desire, but, as anyone who has taught an introductory chemistry course has no doubt observed, students are often driven by a desire to construct a response the instructor views as "correct". Importantly, rapidly producing an answer that earns maximum points is a distinctly different knowledge construction goal from generating a causal account for a perplexing event that is consistent with observation and prior knowledge. The first goal exemplifies the "school game", in which the instructor is the ultimate arbiter of what matters (and what does not).48 The second goal (i.e., "figuring out") is shaped by a recognized gap in understanding; an external standard of "correctness" is not important to achievement of this goal. Instead, learners focused on "figuring out" wish to "ascertain the mechanism underlying a phenomenon in order to resolve a gap or inconsistency in (their) understanding".49


Borrowing from psychology, sociology, and linguistics, education researchers refer to students' sense of what is going on as their frame,50,51 which is based in large part on prior experience. Regarding assessment, we are most concerned with a student's epistemic frame, which orients their approach to knowledge. Students can adopt a variety of epistemic frames depending on their aims for knowledge use and construction, the processes they perceive as reliable for achieving those aims, and the criteria by which they evaluate knowledge. For example, a student might believe their goal is to provide the correct answer, in which case they might focus on recalling information from the textbook or professor, since they trust knowledge from authority. Alternatively, a student who aims to provide an explanation that makes sense to them may draw on everyday experiences and seek an answer that is logical and consistent with their prior knowledge. Note that frames are not necessarily stable and may change from moment to moment.

We contend that some frames have greater utility to learners than others. That is, some answers to the question "how should I approach knowledge?" are likely to be more useful outside of school than other alternatives. For example, a frame in which the goal is to rapidly recall isolated facts and/or perform discrete skills is unlikely to be cued once students leave the confines of school.
We align with many other science education reform efforts in claiming that figuring out how and why the world works as it does is a knowledge construction goal that authentically reflects "doing science" and has the potential to contribute to learners' use of scientific ideas and practices in their post-school life.14,31,49,52 Odden and Russ synthesized a substantial literature base to define "sensemaking", in part, as a stance toward learning focused on figuring out the mechanisms underpinning perplexing phenomena.49 As such, we consider sustained adoption of a "sensemaking frame" to be a central goal of science learning environments. Characteristics of a sensemaking frame that are evident in a dialogic context (e.g., interview setting, classroom discussion) include argumentation, connections to everyday knowledge, and iterative explanation-building.49 This is in contrast to frames such as brainstorming, in which the aim is to recall knowledge; oral examination, in which the aim is to produce the correct answer; and expert interview, in which the aim is to communicate knowledge.53

In addition to literature on how to identify sensemaking and other frames, there has also been some work on how to shift or stabilize frames. Many factors can shift a student's frame, including the wording of a question,8 instructor interventions,54 and interactions with their peers.55 With regard to a sensemaking frame specifically, Odden and Russ53 found that "vexing questions" can help initiate and sustain sensemaking frames when learners are considering relatable scenarios explainable using physics. A question is "vexing" when the person asking it does not readily know the answer, is dissatisfied with their lack of knowledge, and therefore is motivated to figure out an answer. However, prompting students to consider what they learned in school can discourage a sensemaking frame and instead shift students into a brainstorming or oral examination frame.

The Central Importance of Assessing “Doing Science” to Research and Instruction


Most research on students’ sensemaking frames uses interviews to gather data; few studies56 have examined how written assessments affect framing. It is relatively easy to see how vexing questions could be incorporated into classroom discussions, but it remains unclear how (or even whether) written assessments can be designed to stabilize a sensemaking frame over an extended period. The relationship between assessment emphasis and framing is underexplored and in need of further study.

19.7 Conclusion

We hope we have persuaded the reader that 3D assessments are potentially powerful tools for eliciting evidence of student engagement in “doing science”. Responses to well-calibrated 3D prompts can support inferences about the sorts of ideas and connections students find useful for constructing, critiquing, or revising causal accounts for phenomena. Whether you are engaged primarily in teaching or in research, it is vital that you think carefully about the intellectual work emphasized and rewarded on assessments. Assessments send implicit and explicit messages to students about what matters—that is, what “success” means in each course or study. As instructors, we should ensure that our assessments align with the performances we think are important (tasks involving predicting and/or explaining chemical phenomena in terms of core ideas, we argue). As researchers, we should use outcome measures capable of eliciting evidence of engagement in valued performances and interpret responses to these measures using modern models of learning. 3D tasks can provide more nuanced data on learning than simply noting students’ course or exam grades and so, we argue, should feature prominently in chemistry education scholarship. Unfortunately, written assessment responses alone do not provide evidence about how students frame their experiences with 3D tasks. If our goal is for learning environments to support adoption of sensemaking frames for extended durations, the community needs to figure out how learning environment features (including assessments) can effectively message that figuring out why things happen is the goal of class work (not recitation of canon).

Acknowledgements

Support for this work was provided by the Office of the Vice-Chancellor for Research and Graduate Study at the University of Wisconsin, Madison, with funding from the Wisconsin Alumni Research Foundation.

References

1. National Research Council, Knowing What Students Know: The Science and Design of Educational Assessment, National Academies Press, Washington, D.C., 2001.
2. R. S. Russ, J. Res. Sci. Teach., 2018, 55, 94–120.


3. N. Feinstein, Sci. Educ., 2011, 95, 168–185.
4. J. Biggs, Higher Educ., 1996, 32, 347–364.
5. K. S. DeGlopper, C. E. Schwarz, N. J. Ellias and R. L. Stowe, J. Chem. Educ., 2022, 99, 1368–1382.
6. R. L. Stowe and M. M. Cooper, J. Chem. Educ., 2017, 94, 1852–1859.
7. I. Caspari, D. Kranz and N. Graulich, Chem. Educ. Res. Pract., 2018, 19, 1117–1141.
8. O. M. Crandell, M. A. Lockhart and M. M. Cooper, J. Chem. Educ., 2020, 97, 313–327.
9. G. Bhattacharyya and G. M. Bodner, J. Chem. Educ., 2005, 82, 1402.
10. G. Bhattacharyya, Chem. Educ. Res. Pract., 2014, 15, 594–609.
11. R. S. Russ, R. E. Scherr, D. Hammer and J. Mikeska, Sci. Educ., 2008, 92, 499–525.
12. C. Krist, C. V. Schwarz and B. J. Reiser, J. Learn. Sci., 2019, 28, 160–205.
13. D. C. Phillips and J. F. Soltis, Perspectives on Learning, Teachers College Press, New York, 2004.
14. National Research Council, A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas, National Academies Press, Washington, D.C., 2012.
15. J.-M. G. Rodriguez, K. H. Hunter, L. J. Scharlott and N. M. Becker, J. Chem. Educ., 2020, 97, 3506–3520.
16. J.-M. G. Rodriguez and M. H. Towns, Chem. Educ. Res. Pract., 2021, 22, 1020–1034.
17. A. A. diSessa, in The Cambridge Handbook of the Learning Sciences, Cambridge University Press, New York, 2006.
18. G. J. Posner, K. A. Strike, P. W. Hewson and W. A. Gertzog, Sci. Educ., 1982, 66, 211–227.
19. T. S. Kuhn, The Structure of Scientific Revolutions, University of Chicago Press, Chicago, 1962.
20. A. A. diSessa, in Constructivism in the Computer Age, Lawrence Erlbaum Associates, Hillsdale, NJ, 1988, pp. 49–70.
21. A. A. diSessa, Cogn. Instr., 1993, 10, 105–225.
22. D. Hammer, A. Elby, R. E. Scherr and E. F. Redish, in Transfer of Learning from a Modern Multidisciplinary Perspective, Information Age Publishing, Greenwich, CT, 2005, pp. 89–120.
23. E. C. Sayre and M. C. Wittmann, Phys. Rev. Spec. Top. – Phys. Educ. Res., 2008, 4, 020105.
24. M. M. Cooper, L. M. Corley and S. M. Underwood, J. Res. Sci. Teach., 2013, 50, 699–721.
25. D. Hammer, Am. J. Phys., 2000, 68, S52–S59.
26. R. L. Stowe and M. M. Cooper, Isr. J. Chem., 2019, 59, 598–607.
27. K. P. Kohn, S. M. Underwood and M. M. Cooper, CBE—Life Sci. Educ., 2018, 17, ar3.
28. R. S. Russ and L. K. Berland, J. Learn. Sci., 2019, 28, 279–301.
29. R. L. Stowe, B. J. Esselman, V. R. Ralph, A. J. Ellison, J. D. Martell, K. S. DeGlopper and C. E. Schwarz, J. Chem. Educ., 2020, 97, 2408–2420.
30. R. L. Stowe, L. J. Scharlott, V. R. Ralph, N. M. Becker and M. M. Cooper, J. Chem. Educ., 2021, 98, 2490–2495.


31. C. V. Schwarz, C. Passmore and B. J. Reiser, Helping Students Make Sense of the World Using Next Generation Science and Engineering Practices, National Science Teachers Association, 2017.
32. M. M. Cooper, L. A. Posey and S. M. Underwood, J. Chem. Educ., 2017, 94, 541–548.
33. M. M. Cooper, M. D. Caballero, D. Ebert-May, C. L. Fata-Hartley, S. E. Jardeleza, J. S. Krajcik, J. T. Laverty, R. L. Matz, L. A. Posey and S. M. Underwood, Science, 2015, 350, 281–282.
34. National Research Council, Developing Assessments for the Next Generation Science Standards, National Academies Press, Washington, D.C., 2014.
35. S. M. Underwood, L. A. Posey, D. G. Herrington, J. H. Carmel and M. M. Cooper, J. Chem. Educ., 2018, 95, 207–217.
36. J. T. Laverty, S. M. Underwood, R. L. Matz, L. A. Posey, J. H. Carmel, M. D. Caballero, C. L. Fata-Hartley, D. Ebert-May, S. E. Jardeleza and M. M. Cooper, PLoS One, 2016, 11, e0162333.
37. R. L. Matz, C. L. Fata-Hartley, L. A. Posey, J. T. Laverty, S. M. Underwood, J. H. Carmel, D. G. Herrington, R. L. Stowe, M. D. Caballero, D. Ebert-May and M. M. Cooper, Sci. Adv., 2018, 4, eaau0554.
38. L. Ke and C. V. Schwarz, J. Res. Sci. Teach., 2021, 58, 335–365.
39. K. Noyes and M. M. Cooper, J. Chem. Educ., 2019, 96, 1821–1832.
40. N. Becker, K. Noyes and M. Cooper, J. Chem. Educ., 2016, 93, 1713–1724.
41. R. L. Stowe, D. G. Herrington, R. L. McKay and M. M. Cooper, J. Chem. Educ., 2019, 96, 1327–1340.
42. M. M. Cooper, H. Kouyoumdjian and S. M. Underwood, J. Chem. Educ., 2016, 93, 1703–1712.
43. O. M. Crandell, H. Kouyoumdjian, S. M. Underwood and M. M. Cooper, J. Chem. Educ., 2019, 96, 213–226.
44. A. T. Kararo, R. A. Colvin, M. M. Cooper and S. M. Underwood, Chem. Educ. Res. Pract., 2019, 20, 316–328.
45. M. M. Cooper, R. L. Stowe, O. M. Crandell and M. W. Klymkowsky, J. Chem. Educ., 2019, 96, 1858–1872.
46. R. L. Stowe and M. M. Cooper, J. Chem. Educ., 2019, 96, 2072–2085.
47. E. Miller, E. Manz, R. Russ, D. Stroupe and L. Berland, J. Res. Sci. Teach., 2018, 55, 1053–1075.
48. J. L. Lemke, Talking Science: Language, Learning, and Values, Ablex Pub. Corp., Norwood, N.J., 1990.
49. T. O. B. Odden and R. S. Russ, Sci. Educ., 2019, 103, 187–205.
50. R. E. Scherr and D. Hammer, Cogn. Instr., 2009, 27, 147–174.
51. D. Tannen, Framing in Discourse, Oxford University Press, New York, 1993.
52. National Research Council, Next Generation Science Standards: For States, by States, National Academies Press, Washington, D.C., 2013.
53. T. O. B. Odden and R. S. Russ, Int. J. Sci. Educ., 2019, 41, 1052–1070.
54. S. Rosenberg, D. Hammer and J. Phelan, J. Learn. Sci., 2006, 15, 261–292.
55. J. Watkins, L. Z. Jaber, D. Hammer, J. Radoff and A. M. Phillips, J. Res. Sci. Teach., 2018, 55, 573–599.
56. K. Shar, R. S. Russ and J. T. Laverty, Phys. Rev. Phys. Educ. Res., 2020, 16, 020101.

Postface

Outlook—Reasoning Across Scales and Contexts

Advances in Chemistry Education Series No. 10, Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices, edited by Nicole Graulich and Ginger Shultz © The Royal Society of Chemistry 2023, published by the Royal Society of Chemistry, www.rsc.org

Research on student learning experiences in organic chemistry has advanced considerably over the last 10 years. When the current interests of research studies in organic chemistry education are examined, disciplinary differences become apparent: the molecular representations and mechanisms of organic chemistry challenge learners differently than those of general chemistry do, creating a high cognitive load when transitioning between subjects. Recognizing relevant features within a representation, inferring implicit properties, and providing cause-effect relationships are demanding tasks; supporting them requires researchers to understand how students approach and reason with representations, and how instructors, active learning pedagogies, and assessment practices shape the learning experience. The findings presented in this book highlight various aspects of students’ reasoning in small-scale teaching situations, such as group interactions and individual problem-solving, as well as in large-scale curriculum and assessment changes, revealing future areas of research that may further advance learning in organic chemistry.

The book includes multiple in-depth qualitative studies of student reasoning, conducted either in natural classroom settings or in interviews, which provide a nuanced picture of students’ learning and elicit how their reasoning unfolds from a moment-by-moment perspective. With instructors increasingly applying active learning pedagogies in the classroom and with restructured curricula, large classes are shifting toward small collaborative group work, allowing better observation of how learning processes unfold in natural settings. While earlier studies were primarily concerned with characterizing student reasoning, recent approaches focus on the interplay between aspects of learning in the classroom, such as discursive patterns with peers,
engagement with material and scaffolds, as well as instructors or learning assistants, the role of their pedagogical content knowledge, and their influence on student reasoning and argumentation. This ongoing research may also lead to new approaches to understanding the multiple facets of learning in smaller settings. Besides teaching-related factors, the activation and use of knowledge resources in student reasoning is also shaped by students’ epistemic stances or individual epistemologies. Acknowledging how students perceive the nature of knowledge can help us understand how reasoning unfolds and how teaching approaches can positively influence student epistemologies. Absent from this volume were studies on learners’ identity as chemists and on other affective components that may influence how they activate and use resources in their reasoning and otherwise take up disciplinary thinking in organic chemistry. These aspects of learning are gaining increased attention in chemistry education, and we expect future work to focus on their relationship to reasoning in organic chemistry. The chapters in this book that focus on student mechanistic reasoning agree that describing the how and why of organic transformations is more challenging than describing what happens. However, reasoning about multiple alternative reaction pathways and multiple concepts at a time appears to be an emerging challenge as well; it requires students to recognize and use multiple concepts and to weigh them against one another. Further research is needed to clarify the instructional requirements and the scaffolds necessary to support the inference of multiple concepts and the ability to weigh them across reaction contexts. This is instructionally challenging and awaits further attention. Because a reasoning process typically starts with the representation at hand, understanding how students use and interpret representations may inform individual pedagogical approaches and the design of tailored instruction.
Based on ongoing research it seems apparent, however, that simply guiding students visually may not be sufficient to affect their reasoning or their success, depending on the visual complexity of a given representation, such as an organic mechanism. Future research should therefore focus on how to support students conceptually and visually at the same time in their (mechanistic) reasoning across a range of visually complex representations, from recognizing implicit atoms such as omitted hydrogens to three-dimensional displays of molecules and reaction processes. More tailored, context-specific scaffolds could be derived by acknowledging the affordances of context, task prompt, and visual complexity, for instance through investigations of the visual features of a reaction mechanism that students attend to while reasoning. Eye-tracking technology in such investigations can add to our current understanding of how students perceive representations and how this is linked to their prior knowledge. This technology also opens the door to a variety of instructional approaches based on eye-gaze feedback or replays, which have not yet been fully explored but could be the key to more individualized instruction when it comes to representational competence.


For tasks that require spatial ability, instructors can play an important role in supporting spatial thinking through embodied actions, for example by gesturing to convey spatial relations or interactions. New technologies providing augmented or virtual displays have the potential to help students advance their mental (spatial) models. To date, studies have largely focused on displaying the three-dimensional nature of molecules and reactions; however, augmentation might also be a useful tool for strengthening the link between explicit representational features and implicit properties, for instance by visually augmenting electron density or interactions between molecules. In addition to characterizing students’ reasoning with representations and analyzing how learning unfolds, the chapters on assessment in organic chemistry largely agree that assessment practices need to change: assessments that only require students to draw reaction mechanisms are insufficient to evaluate understanding. Assessments need to incentivize student engagement in the practices students are expected to learn and to provide appropriate feedback. Such assessments are challenging to design because they go beyond recalling canonical facts. Emphasizing and assessing more meaningful, mechanistic reasoning in large classes requires new approaches to nuanced feedback. Automated text analysis and other machine learning methods have the potential to make formative assessment of student reasoning in written responses more tractable for educators and more suitable for adaptive learning. Recent advances go beyond the context dependency of automated text analysis and use predictive models that span a variety of reaction contexts. Combining machine learning methods with appropriate educational resources and learning material could substantially change how organic chemistry, and especially reasoning, is taught and assessed.
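One small ingredient of such automated-coding work can be made concrete here: agreement between a model's codes and a human coder's codes is commonly summarized with Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. The following is a minimal illustrative sketch in standard-library Python; the function and the ten response labels are invented for this example and are not drawn from any study in this volume.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two lists of categorical codes."""
    n = len(rater_a)
    # Observed agreement: fraction of items given the same label.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if each rater labelled at random according to
    # their own marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    if expected == 1.0:  # degenerate case: both raters used one identical label
        return 1.0
    return (observed - expected) / (1 - expected)

# Hypothetical codes for ten written responses, each labelled causal ("C")
# or descriptive ("D") by a human coder and by an automated model.
human = ["C", "C", "D", "C", "D", "D", "C", "C", "D", "C"]
model = ["C", "C", "D", "D", "D", "D", "C", "C", "C", "C"]
print(round(cohens_kappa(human, model), 2))  # → 0.58
```

Values near 1 indicate near-perfect model-human agreement; values near 0 indicate agreement no better than chance, which is why kappa, rather than raw percent agreement, is the conventional benchmark for automated coding.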
Using machine learning over time to track and monitor students’ learning progressions from Organic Chemistry I to more advanced courses is another new avenue for research. Besides refining and testing these machine learning approaches, we should include instructors’ perspectives in the process, both to reduce feelings of rejection or fear of being replaced and to design appropriate educational materials that aid instructors in using these tools effectively. In the end, evidence-based practice is implemented by the instructor in the classroom, and their perspective therefore needs to be valued.

Nicole Graulich
Ginger Shultz

Biographies of Authors

Gyde Asmussen (she/her/hers) is a PhD Student in the Department of Chemistry Education at the IPN—Leibniz-Institute for Science and Mathematics Education.
Emily Atieh (she/her/hers) is a postdoctoral researcher in the Chemistry department at the University of Virginia. Working under Dr Marilyne Stains, her research mainly focuses on the professional development of higher education faculty.
Sascha Bernholt (he/him/his) is a Research Scientist in the Department of Chemistry Education at the IPN—Leibniz-Institute for Science and Mathematics Education.
Gautam Bhattacharyya (he/him/his) earned his PhD degree at Purdue University in 2004 under the direction of Prof. George Bodner and joined the Department of Chemistry at Missouri State University in 2014, where he is an Associate Professor.
Ryan Britt (he/him/his) is a graduate student at the University of Northern Colorado.
Nikita Burrows, PhD (she/her/hers) is an Assistant Professor of Chemistry at Monmouth University. Her research focuses on the role of the laboratory in undergraduate chemistry education and the development of alternative assessments to standard multiple-choice formats.
Myriam S. Carle (she/her/hers), MSc, is a PhD Candidate at the University of Ottawa and has a MSc in Organic Chemistry from the University of Waterloo. Her research focuses on using learning outcomes to promote student success.
Ira Caspari-Gnann (she/her/hers) is an Assistant Professor in the Departments of Chemistry and Education at Tufts University.


Olivia Crandell (she/her/hers) is a Postdoctoral Research Associate at the University of Wisconsin, Madison.
Melanie Cooper (she/her/hers) is Lappan-Phillips Professor of Chemistry at Michigan State University. Her research focuses on curriculum design and assessment for general and organic chemistry.
Kimberly DeGlopper is a graduate student in Ryan Stowe’s group at the University of Wisconsin, Madison. Her research focuses on understanding the epistemologies of organic chemistry students and instructors.
Jacky M. Deng (he/him/his) is a PhD Candidate at the University of Ottawa. His research focuses on investigating students’ reasoning in chemistry in equitable and inclusive ways.
Amber J. Dood (she/her/hers) is a graduate of the Chemical Education Research program at the University of South Florida. Dr Dood is a postdoctoral research associate at the University of Michigan.
Julia Eckhard (she/her/hers) is a PhD Student at the Institute of Chemistry Education at the Justus-Liebig-University Giessen.
Aubrey Ellison is an Associate Organic Lab Director and Lecturer at the University of Wisconsin–Madison. She enjoys working collaboratively with others to develop and deliver curricular material for the lab and lecture courses.
Brian Esselman is an Associate Organic Lab Director and Lecturer at the University of Wisconsin, Madison. His research focuses on developing chemistry curricula involving student application of models to rationalize chemical phenomena, and on the rotational spectroscopy of small organic molecules.
Alison B. Flynn (she/her/hers) is an Associate Professor in the Department of Chemistry and Biomolecular Sciences and Associate Vice Provost, Academic Affairs at the University of Ottawa. Through their research, she and her group seek evidence-informed ways to improve education in chemistry and beyond in higher education.
Sujani Gamage, MSc (she/her/hers) is a PhD candidate in the Department of Chemistry at Georgia State University. Her research is focused on organic chemistry students’ understanding of NMR and electrophilic aromatic substitution.
Nicole Graulich (she/her/hers) is an associate professor of chemistry education at the Justus-Liebig-University in Giessen, Germany.
Jolanda Hermanns (she/her/hers) works as a researcher at the University of Potsdam.
David Keller (he/him/his) works as a PhD student in the group of Jolanda Hermanns at the University of Potsdam.
Sebastian Keller (he/him/his) studied chemistry and computer science for a secondary teacher degree at the University of Duisburg-Essen, Germany.
Sebastian Habig (he/him/his) is a full professor of chemistry education at the Friedrich-Alexander-University Erlangen-Nuremberg, Germany.
Julia Hoang (she/her/hers) completed her BA in Exercise and Sport Science at the University of North Carolina at Chapel Hill and is currently a

post-baccalaureate researcher in the Popova group at the University of North Carolina at Greensboro.
Saša A. Horvat is an assistant professor of Chemistry Teaching Methods in the Faculty of Sciences, University of Novi Sad, Republic of Serbia. His current focus is on cognitive complexity measures, primarily the application of Knowledge Space Theory in cognitive complexity evaluation, the systemic approach to teaching and learning chemistry, and misconceptions in chemistry, and he serves as an editorial board member of the journal Problems of Education in the 21st Century.
Jessica Karch (she/her/hers) is a postdoctoral scholar in the Caspari group. She earned her PhD from the University of Massachusetts Boston in 2021.
Leonie Lieber (she/her/hers) is a PhD student at the Institute of Chemistry Education at the Justus-Liebig-University in Giessen, Germany. She receives a Kekulé Fellowship from the Verband der Chemischen Industrie (German Chemical Industry Association).
Matthew Lira is an Assistant Professor of Educational Psychology & Learning Sciences at the University of Iowa.
Jherian Mitchell-Jones (she/her/hers) is a graduate student researcher in the Chemistry department at the University of Virginia working under Dr Marilyne Stains.
Suazette Mooring, PhD (she/her/hers) is an Associate Professor of Chemistry at Georgia State University. Her research focuses on chemistry education, with a primary interest in evidence-based practices in Organic Chemistry and students’ conceptual understanding of core concepts throughout the chemistry curriculum. She is also an Associate Editor for the Journal of Chemical Education.
Maia Popova (she/her/hers) completed her PhD in Chemistry with Dr Stacey Lowery Bretz at Miami University and her postdoc with Dr Marilyne Stains at the University of Nebraska, Lincoln. Currently, she is an assistant professor at the University of North Carolina at Greensboro.
Jeffrey R. Raker (he/him/his) is an Associate Professor in the Department of Chemistry at the University of South Florida and the Associate Director of ACS Exams. Dr Raker’s research is in the areas of learning across the postsecondary chemistry curriculum, measures of affect in chemistry instructional contexts, and applications of survey research methodologies for understanding postsecondary STEM education instructional practices.
Marc Rodemer (he/him/his) is a Research Scientist in the Department of Chemistry Education at the University Duisburg-Essen.
Dušica D. Rodić is an associate professor of Chemistry Teaching Methods in the Faculty of Sciences, University of Novi Sad, Republic of Serbia. Her principal research interests are the chemistry triplet relationship, evaluation of mental effort, cognitive complexity, development of multi-tier tests, the systemic approach to teaching and learning chemistry, knowledge space theory applications, and misconceptions in chemistry, and she serves as an editorial board
member of the Journal of Baltic Science Education and the J-PEK journal (Jurnal Pembelajaran Kimia), and an editorial advisory board member of the Journal of Chemical Education.
Tamara N. Rončević is an assistant professor of Chemistry Teaching Methods in the Faculty of Sciences, University of Novi Sad, Republic of Serbia. She brings expertise regarding the systemic approach to teaching and learning chemistry, systems thinking, cognitive load, illustrative methods in teaching chemistry, and the triplet model of knowledge representation, and serves as an editorial board member of the J-PEK journal (Jurnal Pembelajaran Kimia).
Fridah Rotich (she/her/hers) completed her BS in Chemistry at Davidson College. Currently, she is a third-year graduate student in the Popova and Cech groups at the University of North Carolina at Greensboro.
Catharina Schmitt (she/her/hers) studied chemistry and history for the teaching profession at the Philipps University of Marburg and worked as a research assistant in the group of Michael Schween. She works as a secondary school teacher at the Sachsenwald School Reinbek, Germany.
Cara Schwarz is a graduate student in Ryan Stowe’s group at the University of Wisconsin, Madison. She is interested in developing curricular materials and studying student reasoning in high school and organic chemistry learning environments.
Michael Schween (he/him/his) studied chemistry and German language and literature for the teaching profession and obtained a doctorate in organic chemistry at Philipps University of Marburg, Germany. He is a professor at the Faculty of Chemistry.
Stephanie Scopelitis is an educator and educational consultant.
Gulten Sendur is an associate professor of chemistry education at Dokuz Eylul University, Turkey.
Ginger Shultz (she/her/hers) is an Assistant Professor of Chemistry at the University of Michigan.
Marilyne Stains (she/her/hers) is an Associate Professor in the Chemistry department at the University of Virginia. Her research group focuses on enhancing students’ learning environments in STEM courses by characterizing factors influencing STEM faculty members’ instructional practices.
Mike Stieff is Professor of Chemistry & Learning Sciences at the University of Illinois, Chicago.
Ryan Stowe is an Assistant Professor of Chemistry at the University of Wisconsin, Madison. He leads a research group that studies how chemistry learning environments could and should support students in making sense of phenomena in terms of molecular behaviour.
Andreas Trabert (he/him/his) studied chemistry and geography for the teaching profession and obtained his PhD in chemistry education at the Philipps University Marburg. He completed in-service teacher training at LI Hamburg. Today, he works as a secondary school teacher at the Centre for Adult Education Hamburg, Germany.


Katie Walsh (she/they) is a writer and teacher from Somerville, MA, unaffiliated with any institution. She got her undergraduate degree at Lesley University.
Field Watts (he/him/his) is a graduate student in the Department of Chemistry at the University of Michigan.
Melissa Weinrich (she/her/hers) is an associate professor of chemical education at the University of Northern Colorado.
Lyniesha Wright Ward (she/her/hers) completed her PhD in Chemistry with Dr Maria Oliver-Hoyo at North Carolina State University and is currently a postdoctoral fellow in the Popova group at the University of North Carolina at Greensboro.
Dihua Xue (she/her/hers) is an alumna of Dr Marilyne Stains’ group and is currently teaching chemistry as a lecturer at the University of Minnesota Rochester.
Brandon J. Yik (he/him/his) is a graduate student in the Chemical Education Research program at the University of South Florida.

Subject Index

3D images see augmented reality (AR) studies
3D Learning Assessment Protocol (Laverty) 326–328
3D tasks/assessments 326, 327–328, 329, 330
‘absent’ level of explanation 311, 314
abstractness in reasoning 5–6, 8, 10
‘acceptance’ epistemic stance 114, 115, 117
accommodation in conceptual change 323–324
accuracy model assessment 292, 297, 298, 299, 309
acetaldehyde 216, 274–275
acetylsalicylic acid 76
acidic protons identification 145, 148, 150–153
acid–base examples
    causal mechanistic reasoning 62–64, 70, 71, 82
    eliciting student mechanistic reasoning 286–287
    epistemic stance study 117
    indicators 260
    scaffolded reasoning prompts 64–66
activation chains 106
‘active’ ICAP engagement level 166
active learning pedagogies see flipped classrooms; in-the-moment learning
activities of entities 130, 132–134

algorithm selection in machine-learning 309
aliphatic hydrocarbons 183–185
alkaline ester hydrolysis 258–259
alkanes 168, 183–184
alkenes 272, 322–323
alkyl halides 272
Amelia (LA) in case study 150–151, 153, 154, 155
Anzovino/Bretz electrophiles study 313, 314
AR see Augmented Reality (AR) studies
areas of interest (AOIs) 5, 7, 13–14, 15
    see also eye tracking studies
argumentation
    constructive alignment of evidence/claims 74–85
    epistemic stances 111–112
    flipped course student dialogue 167
    group quizzes study in flipped classrooms 174
    network analysis of concept use 90–106
    rebuttals 167
    scaffolding questions 175
    varies between tasks 78–80
    ‘what?/why?’ study 125–139
    see also interviews
aromatic compound SAQs diagram 190, 191
arrow drawings see curved arrow studies

ASCI (Attitude toward the Subject of Chemistry Inventory) 164
Ashley/Candice/Tiffany case study 168–171, 173–174, 175, 176
assessment 285–301, 304–316, 320–335
    tools
        meaningful understanding in SATL 181
        systemic assessment questions 181, 183–187, 188–192
    see also assessment of assessment in organic chemistry; ‘doing science’ assessment study
assessment of assessment in organic chemistry 269–283
    background/scope 269–271
    electron-pushing mechanisms 270, 278–281
        traditional tasks 278–281
    individual reactions 270, 271–273
    non-traditional mechanistic reasoning tasks 271, 281
    synthesis 270, 273–278
        non-traditional assessment 276–278
        traditional tasks: student solutions 273–276
ATA (automated text analysis) models 287–288, 289
attack-split off 130, 132
Attitude toward the Subject of Chemistry Inventory (ASCI) 164
Augmented Reality (AR) studies 19–33
    discussion 32–33
    instructional aid 22–23
    multiple external representations 19–20
    results 26–32
    sample/design 24–26
    spatial reasoning 21–22
    study aims 24
tasks set 25–32 three factors (Azuma) 22, 23 Augmented Reality Chemistry (ARC) app 24, 25 Ausubel’s ‘meaningful understanding’ 180–193, 216, 228 automated analysis see machine learning models study automated text analysis (ATA) models 287–288, 289 CNN models development 297 development in WTL study 292–293 training machine learning models for 308–309 axial coding 39 Azuma’s AR ‘three factors’ 22, 23 see also Augmented Reality (AR) studies backward chaining see forward/ backward chaining ball-and-stick representations 4, 28–31 base comparisons/argumentation 78–80 base-free Wittig reaction 290, 291, 293, 297 ‘belief’ epistemic stance 114, 115, 116, 117, 119, 121, 122 benzylic carbonation 329 ‘Betty’ spatial thinking case study 238–241 betweenness centrality parameter 93, 96, 101, 102, 104 Bhattacharyya/Bodner’s eight electron-pushing tasks 278–279, 280, 282 Bhattacharyya’s mechanistic reasoning 59–60, 61 blackboards and embodied actions 238–239 Bode/Deng/Flynn student reasoning categorisation 3 bonding codes 133, 135, 136

bridging concepts 106, 148, 149, 150
Brittany/Dawn/Farrah/Jasmine case study 168, 169, 171–174, 175, 176
bromoethane 274–275
Brønsted acid–base theory 63, 131, 132, 136–137
Brønsted Causal/Descriptive responses 63, 68
butane 331, 332
C-R-M (Schönborn/Anderson’s) model 37–39, 40–41, 43–46, 50–52
canyon–bridge analogy 148, 149, 150
caption-writing see ‘what?/why?’ study
case comparison tasks see contrasting cases (CCs)
Caspari et al ‘chaining’ model 249, 250
Caspari/Graulich teaching scaffold 261
causal mechanistic reasoning 59–72
    arrow drawings and reasoning 70–71
    Caspari et al ‘chaining’ model 249, 250
    concept relevance 91–92
    constructivist/analogy-based learning 249
    Crandell’s definition 287
    curricular/task design 250
    definitions 59–60
    different reaction examples 62–64
    electron movement 59–62
    Goodwin framework 249
    knowledge structure organization 92–93
    machine learning models study 293–301
    network analysis of concept use 90–106

Subject Index

  non-traditional electron-pushing tasks 281
  organic chemistry examples 66–70
  scaffolding importance 64–66
  secondary education 256–257
  SSQs study 216–217
  student problems 91
  tertiary education 257–260
  use by students 271–272
  ‘what?/why?’ study 125–139
  WTL study analysis 291–292
  see also Lewis causal mechanistic reasoning
causal/descriptive reasoning study see ‘what?/why?’ study
CCs see contrasting cases (CCs)
CER (Toulmin’s claim-evidence-reasoning) model 21, 82–84
‘chaining’ model (Caspari et al) 249, 250
chaining reasoning strategies 4, 6–7, 10, 12–13
charge codes 135–136
Chemical Thinking curriculum 145
Chemistry, Life, the Universe and Everything (CLUE) courses 7, 66, 67, 68, 70–72
chemistry-based reasoning 270, 272, 273, 279, 282
Chi’s ICAP framework 166–169, 173–174, 175
chlorobutanol 113, 139, 141
chocolate example 61
Christian/Talanquer’s three social interaction types 143, 150
citizens’ understanding/reasoning 74–85
claim-evidence-reasoning (CER) model (Toulmin) 21, 82–84
claims see constructive alignment of evidence/claims
clicker questions 163, 277, 307
CLUE (Chemistry, Life, the Universe and Everything) course 7, 66, 67, 68, 70–72

CNNs (convolutional neural networks) 292, 297, 299
‘co-construction’ social interaction type 143, 167
coding schemes
  3D tasks/assessments 329
  bonding codes 133, 135, 136
  epistemic stance study 114
  group quizzes study in flipped classrooms 169
  interview transcripts 39–40
  Lewis Causal Mechanistic reasoning 63
  machine learning models study 291–292, 293, 294–297
  network study of concept use 96–97
  resonance structures PCK study 203
  SSynQs study 221, 222–227
  systemic synthesis questions study 226–227
  video microanalysis of embodied actions 238
  ‘what?/why?’ study 127–129
cognition 288, 323–324, 325
cognitive process theory (Flower/Hayes) 288
cognitive task analysis 96
Cognitive Theory of Multimedia Learning (Mayer) 20, 22
Cohen’s kappa 292, 297, 298, 299, 309
coherence criterion 253
coherent learning environment 252–255
collective PCK (cPCK) 198, 199
compare–predict–observe–explain (CPOE) cycle 250, 276, 288
comparisons dimension 75–78, 84
‘complex’ level of explanation 311, 312, 313, 314
concept use/understanding
  categorizations 215

  causal mechanistic reasoning 91–92
  centrality of concepts 104
  comprehensive learning environment 253
  conceptual models 217–219, 223–227
  knowledge structure organization 92–93
  maps 181–186, 218–219
  network analysis 90–106
  network study 93–94, 95–98
  Novak, Nieswandt 225
  range of concepts 179–180
  resonance structures 195–211
  systemic synthesis questions study 214–229
  theories of cognition 323
conceptual-mode (C-M) factor 38
conceptual-reasoning-mode (C-R-M) factor 37–39, 40–41, 43–46, 50–52
conductance analytics 260
confusion matrix 297, 298, 299
connectional understanding see relational understanding
Consensus Model of PCK 198–199
constructed response items 287
constructive alignment of evidence/claims 74–85
  arguments vary between tasks 78–80
  definition 80–81
  reasoning/granularity/comparisons 75–78
  supporting student learning 80–84
‘constructive’ ICAP engagement level 166, 167
contrasting cases (CCs)
  free association to goal-directed problem-solving 94–106
  tasks and flasks 250, 251, 253, 256, 257, 258, 259, 262

convolutional neural networks (CNNs) 292, 297, 299
Cooper’s constructed response items 286
Cooper’s observation/interpretation/cognition study 324–325
cPCK (collective PCK) 198, 199
CPOE (compare–predict–observe–explain) cycle 250, 276, 288
Crandell’s causal mechanistic reasoning 287
curricular influences 71, 72, 250, 256–260, 304–316
curved arrow studies
  areas of interest 5, 7, 13–14, 15
  causal mechanistic reasoning 70–71
  data analysis 8–11
  electron-pushing formalism 59–62
  eye tracking 7, 8
  research questions 7, 13–14
  results/conclusions 11–17
  success in student reasoning 14–16
  working methods 7–11
dash-wedge diagrams (DWDs)
  definition 37
  embodied actions teaching 242–243
  interpretation tasks 41–42
  Newman projection conversion 27–28, 48–49
  perceptual learning study 233, 234
  representational competence study 36–54
  translation tasks 27, 42–49
data analysis
  curved arrow studies 8–11
  epistemic stance study 114
  group quizzes study 168–169
  in-the-moment learning 146
  machine learning models study 291–293

  network study of concept use 95–98
  resonance structures study 202–203
data collection
  curved arrow studies 8
  group quizzes study 168–169
  in-the-moment learning 145–146
  machine-learning studies 291, 308–309
  network study of concept use 95–98
  resonance structures study 200–202
deductive coding 39
Dee/Carl/Crystal case study examples 41–52
deep structure/reasoning
  creating the learning environment 252–253
  Flynn/Ogilvie’s scheme 254, 255
  Goodwin framework 258
  inventing with contrasting cases 250
  ‘task and flask’ bridging study 262
degree centrality parameter 93, 96, 99, 101
description/explanation difference 65–66
‘descriptive’ level of explanation 311, 312, 314
descriptive reasoning mode (Sevian/Talanquer)
  constructive alignment study 75, 76, 77, 83
  ‘what?/why?’ study 126, 128, 129
descriptive-causal reasoning category 129, 134
diagrams see systemic diagrams
dienes/dienophiles 31–32
1,2-difluoroethane 331, 333
disagreement between peers 175

‘disbelief’ epistemic stance 114, 119, 120, 121, 122
disciplining perception spatial thinking study 232–245
  conclusions 244–245
  cross-case analysis 243–244
  embodied actions 233, 235–245
  large lecture halls 241–243
  methods 238
  perceptual learning with visual representations 234–235
  small groups/spaces 237–239
  steps for spatial thinking 238–241
  study aims 237
discodermolide 277
diSessa’s ‘p-prims’ 324
distinction, system, relationship, and perspective (DSRP) model 187
‘doing chemistry’ causal explanations 325
‘doing it’ embodied actions strategy 239
‘doing science’ assessment study 320–335
  3D assessments 329–333
  assessing work aligned with practice 325–329
  assessment consensus view 321
  conclusions 335
  future research directions 333–335
domain-general reasoning 272, 273, 279, 282
Dood’s lexical analysis 287
‘doubting’ epistemic stance 114–122
drawings 215, 224, 304–305, 306, 316
  see also representation types
DSRP (distinction, system, relationship, and perspective) model 187

DWDs see dash-wedge diagrams (DWDs)
electron-pushing formalism (EPF) 59–62, 180, 286
electron-pushing mechanisms (EPMs) 270, 272, 278–281
electronic substituents 254, 255, 257–260
electrophiles 312–313, 314
electrostatic interactions 62, 63
electrostatic potential maps 4
embodied actions 233, 235–245
enacted PCK (ePCK) 199, 203–211
EPF (electron-pushing formalism) 180
‘epistemic frame’ of student understanding 334
epistemic stances study 110–123
  conclusion/implications 122–123
  definition 111, 112
  research questions 113
  results/discussion 115–121
  study data analysis 114
  study design/methods 113–114
epistemology as social practice 144
ethane 220, 224, 225, 226, 227, 324–325
ethanol/ethanal 220, 224, 225, 324–325
ethnicity 145
evidence see constructive alignment of evidence/claims
exam questions 331
exemplarity 252
expectations communication 80, 82, 83–84
experiment-supported ‘deep reasoning’ see ‘task and flask’ bridging study
expert solutions 93–106
  see also meaningful understanding

explanation sophistication 312–314
  four levels 311–312, 313, 314, 316
explicit electron movement code 130, 133
eye tracking studies 4, 5, 7, 13–14, 15
Fahmy/Lagowski’s different types of SAQ 184–185, 189, 219–221
feedback for written explanations 305
fill-in-the-blank systemic diagrams 220–229
Finkenstaedt-Quinn WTL assignments structure 289–290
Fischer projections 233, 235
fixation duration 5, 7, 13–14, 15
fixation-to-transition ratios 5
flasks see ‘task and flask’ bridging study
flipped classrooms 161–176
  conclusions/implications 174–176
  findings 169–174
  group work, in-class 163–165
  quizzes, pre-class/in-class/online 163
  student dialogue case study 165–169
  student response systems, in-class 163
  videos, pre-class 162–163
Flower/Hayes cognitive process theory 288
Flynn/Ogilvie’s scheme 254, 255
Fortuin’s conceptual models 218–219
forward/backward chaining 4, 6–7, 10, 12–13
‘foundational’ level of explanation 311, 313, 314
‘frame’ of student understanding 334–335
framework for evaluating understanding 15, 310–313, 315

Framework for K-12 Science Education 326
free electron pair attacks 133
Friedel–Crafts acylation 191, 192
General Descriptive response 63, 68, 69
generalizable predictive models 307–308
generalized terminology 10, 12
generation tasks 44–46, 48–49
gestures 235–237, 244, 245
Gibbs–Helmholtz equation 256, 257
goal-directed problem-solving network analysis see network analysis of concept use
Goodwin framework 249, 258
graduate teaching assistants (GTAs) 197
‘granularity’ dimension 75–78, 84
Graulich’s iceberg analogy 180
group quizzes study 166–169
  argumentation/student reasoning 167, 174
  conclusions/implications 174
  course context/participants 167–168
  data collection/analysis 168–169
  findings 169–174
  formats of quizzes 168
  group quiz format 168
  ICAP framework/analysis 166–169, 173–174
group work/discussions 152–155, 163–165
GTAs (graduate teaching assistants) 197
Ha/Imani case study 153–154
Hammer/Elby’s ‘resource’ 111, 112, 114, 122, 324, 325
Harper case study 151, 155

Hegarty/Waller’s hybrid reasoning process 21
higher order thinking skills (HOTS) 186, 189, 190, 215
Hrin’s classification of student concepts 187
hybrid reasoning process (Hegarty/Waller) 21
hybrid structures 196–197
  see also resonance structures study
hydrocarbons/halogen derivatives 189, 221–229
hydrochloric acid/water reaction 66–70
hydroxide 113, 115–121
ICAP (interactive–constructive–active–passive) framework 166–169, 173–174, 175
ICC (inventing with contrasting cases) 250
iceberg analogy (Graulich) 180
‘imagine doing it’ embodied actions strategy 239
implicit/explicit information
  codes in machine learning models study 296
  curved arrow studies 4, 5–6, 8, 10, 11–12
  explicit electron movement code 130, 133
  implicit atoms in NPs/DWDs 53
  use by students 142, 146
in-class activities 163–165
in-the-moment learning study 141–156
  acidic protons identification 145
  conclusions/implications 155–156
  data analysis 146
  data collection 145–146
  ethnicity 145
  gap patterns 147–156

  gaps needing explanation 144, 146
  group discussions 152–155
  methodology 145–146
  practical epistemology analysis 143–144
  results/discussion 146–155
  study context 145
‘interactive’ ICAP engagement level 166, 167
interactive lectures see in-the-moment learning
interactive–constructive–active–passive (ICAP) framework 166–169, 173–174, 175
intermediate stability (Flynn/Ogilvie) 254, 255
interpretation tasks 50–52, 324–325
  see also ‘task and flask’ bridging study
interviews
  assessment of assessment study 279, 280
  C-R-M (Schönborn/Anderson’s) model study 39
  epistemic stances study 113–121
  network study of concept use 95–98
  observation/interpretation/cognition 324–325
  reasoning explanations 21–22, 25, 26, 28–31
intuition 119, 120, 121
inventing with contrasting cases (ICC) 250
‘it-gets-me-to-the-product’ strategy 279
Johnstone’s Triangle 251
journal papers 271
‘just-in-time’ instruction 163
K-12 Framework for Science Education 197, 326

‘Kalli’ student responses 280
Kavita/Mia/Sofia case study 150, 154–155, 156
Kekulé structures 196
Klein’s reaction motifs 310
‘knowing what the students know’ 321
knowledge structure organization 92–93, 197–199
‘knowledge-in-pieces’ learning model 323, 324, 325
laboratory experiments 250
LAM (Lewis Causal Mechanistic) reasoning 62–64
Laverty’s ‘3D Learning Assessment Protocol’ 326–328
Leah/Lucas/Ben case study 148–150
learning assistants (LAs) 143, 147–152
learning environment design 252–253
learning models for conceptual change 323–324
learning outcomes (LOs) 80–84
lecture hall teaching 241–243
Lewis acid–base models 308, 309, 313, 315
Lewis Causal Mechanistic (LCM) reasoning
  fostering 62–64, 68, 287
  scaffolded reasoning prompts 65
  ‘what?/why?’ study 132–133, 134, 136
Lewis Mechanistic response (LM) 63, 68
lexical analysis (Dood) 287
  see also machine learning models study; writing-to-learn (WTL) strategies
linear causal reasoning mode (Sevian/Talanquer) 75, 76, 77, 83, 126
Livescribe Smartpen© 39

Lomask et al scoring scheme 223–227
low-stakes tasks 69, 72
machine learning models study 285–301
  analysis of student writing 287–288
  conclusions 301
  data analysis 291–293
  data collection 291
  eliciting students’ mechanistic reasoning 286–287
  implications 299–300
  limitations 300
  methods 289–290
  research questions 288–289
  results/discussion 293–299
  theoretical framework 288
  writing-to-learn strategies 286–287, 288–292
machine learning-based written explanation evaluation study
  assessment of learners’ understanding 306–308
  evaluation framework 310–313
  implications 313, 315
  learner understanding of reaction mechanisms 305–306
  path to better learning 316
  post-secondary curricula 304–316
  training machine learning models for ATA 308–309
Magnusson model see pedagogical content knowledge (PCK) study
Mai/Minh case study 152
mathematics teaching research 235
Matthews correlation coefficient (MCC) 292, 297, 298, 299, 309
Mayer’s Cognitive Theory of Multimedia Learning 20, 22
Mayer’s spatial contiguity principle 23

MCC (Matthews correlation coefficient) 292, 297, 298, 299
meaning-making tasks 289–290
meaningful understanding 180–193
  assumption from student drawings 306
  Ausubel 180–193, 216, 228
  nature of 310
  research requirement in teaching 306
  undergraduates using curved arrows 305
means-ends analysis 279, 280
mechanistic arrows see arrow drawings; curved arrow studies
mechanistic reasoning see causal mechanistic reasoning
memorizing/rote learning 216, 272, 277, 306, 313
mental models 217–219, 222–227
mental rotation abilities 23–33
mesomerism coding 130, 133
methyl ethanoate 216
‘Mike’ case study in spatial thinking 241–243
Milgram’s Virtuality Continuum 22–23
misunderstandings in problem-solving 222–223, 226–227, 228, 274
mobile devices for AR access 23
model engagement measures 326, 327
modelling in causal mechanistic reasoning 59–72
modes of reasoning 38, 39, 40–41, 43–46, 50–52, 75, 76, 77, 83, 84
multi-component causal reasoning mode (Sevian/Talanquer) 75, 76, 77, 83, 126
multiple-choice questions 52–53, 82, 168, 184, 272, 307
network analysis of concept use
  discussion/conclusions 102–106

  knowledge structure organization 92–93
  mechanistic reasoning 90–106
  method 94–98
  research questions 93–94
  results 98–102
  theoretical background 91–93
Newman Projections (NPs)
  ‘3D tasks/assessments’ 326
  definition 37
  from dash-wedge notation 27–28, 48–49
  generation tasks 44–46, 48–49
  interpretation tasks 42
  perceptual learning study 233, 234
  representational competence study 36–54
  translation tasks 42–49
  using to make stability inferences 49–50
Nieswandt’s conceptual understanding 225
nodes in networks of concept use 93, 96–98
Norman’s target system/conceptual model of target system distinction 217–218
Novak’s conceptual understanding 225
NPs see Newman Projections (NPs)
nucleophile–electrophile code 130, 132, 133, 135
nucleophile–electrophile concept 129, 130
nucleophilic substitution reactions
  assessment of assessment study 281
  causal mechanistic reasoning 63, 66–70
  network study of concept use 94–106
  scaffolded reasoning prompts 64–66

nucleophilicity concept 106, 254, 255
NVivo 8, 202
nylon sutures 327–328
O-Flip-PLTL (organic chemistry flipped courses) 164–165
observations see student response systems (observations)
OCLUE (Organic Chemistry, Life, the Universe and Everything) course 67, 69–72, 272, 282, 330
octet rule 131
Odden/Russ’s ‘sensemaking’/‘vexing questions’ 334–335
Oliver case study 152, 155
online course content 162, 163
Organic Chemistry, Life, the Universe and Everything (OCLUE) course 67, 69–72, 272, 282, 330
‘p-prims’ (diSessa, ‘phenomenological primitives’) 324
Pabuccu/Erduran criteria 169
partial charge 6
‘passive’ ICAP engagement level 166
PEA (practical epistemology analysis) 143–144, 147–156
pedagogical content knowledge (PCK) study 197–211
  collective PCK 198, 199
  consensus on 198–199
  enacted PCK 199, 203–211
  five components 197–198
  personal PCK 199
  Refined Consensus Model 198–199
  resonance structures study 197–211
  in sciences 197–198
peer review processes 290
peer-led team learning (PLTL) 164
peer-solutions 113, 115, 116, 122
peer-to-peer interactions 143

perceptual learning (PL) theory 233–237
perceptual stances 240, 242
pericyclic reactions 31–32
personal PCK (pPCK) 199
‘phenomenological primitives’ (diSessa, ‘p-prims’) 324
pictorial representations 19–20
  see also individual representation types; representational competence (RC); representations
PL (perceptual learning) theory 233–237
PLTL (peer-led team learning) 164
POGIL (process-oriented guided learning enquiry) 163–164, 176
polyglycolic acid 327–328
pPCK (personal PCK) 199
practical epistemology analysis (PEA) 143–144, 147–156
pre-class activities 162–163
pre-service teachers 190, 191, 192
predictive models 307–308
preferential attachment 93
prior knowledge use 20–21, 142, 228, 323
problem types 269–283
process orientation 110, 111, 113–114, 123, 253
process-oriented guided learning enquiry (POGIL) 163–164, 176
product cards 113, 115, 116, 119, 122, 123
product conformation determination 31–32
projector screens 241–243
prompts
  ‘3D tasks/assessments’ 328, 329, 331, 332
  causal mechanistic reasoning 64–66, 297, 301
  Cooper’s constructed response items 286–287
  Crandell’s causal mechanistic reasoning 287

  epistemic stance study 121, 122, 123
  good design 322, 323
  group quizzes study in flipped classrooms 169–173
  machine learning models study 290
  non-traditional electron-pushing tasks 281
  revising to give better help 81, 82
  Wood’s scaffolded opportunities 64–66
propionic acid 216
protonate-deprotonate code 130, 132
Purdue visualization of rotation test (PSVT:R) 24, 26
‘puzzlement’ epistemic stance 114–116, 118–122
quizzes see group quizzes study
R-C (reasoning-conceptual) factor 38
R-M (reasoning-mode) factor 38, 39, 40–41, 43–46, 50–52
Raker/Towns’s ‘seven recommendations’ 276
reaction motifs (Klein) 310
reasoning dimension 75–78
reasoning explanations 21–22, 28–31
reasoning patterns 52–53
  see also rule-based reasoning patterns
reasoning strategies definition 40
reasoning-mode (R-M) factor 38, 39, 40–41, 43–46, 50–52
reasoning–conceptual (R-C) factor 38
rebuttals in argumentation 167
Refined Consensus Model (RCM) of PCK 198–199
relatedness between concepts (Lomask et al) 225

relational conceptual understanding (Graulich) 215–216
relational reasoning mode (Sevian/Talanquer) 75, 76, 77, 83, 126
representation types
  curved arrow studies 3–17
  multiplicity 224
  resonance structures 196
  structural drawings ease of use 215
  visual representations in perceptual learning 232–233, 234–235
  see also curved arrows; dash-wedge diagrams (DWDs); Newman Projections (NPs)
representational competence (RC) study 36–54
  conclusions 50–52
  research implications 53–54
  results/findings 40–50
  role/definition 36–37
  Schönborn/Anderson’s model 37–39
  study design/methods 39–40
  target skills 37, 39
  teaching implications 52–53
research questions (RQs)
  3D tasks/assessments 329–335
  curved arrow studies 7, 13–14
  epistemic stances 113–114
  machine-learning study 288–289, 315
  network analysis of concept use 93–94
  PCK for resonance structures 199–200
  representational competence studies 53–54
  ‘task and flask’ bridging study 251–255
  ‘what?/why?’ reaction mechanism study 126–127

resonance structures study 195–211
  comparisons/argumentation 78–80
  concept importance 195–196
  conclusions/implications 211
  data analysis 202–203
  data collection 200–202
  limitations 210
  methods 200–203
  participants 200
  PCK study see pedagogical content knowledge (PCK) study
  results 203–207
  theoretical framework 197–200
‘resource’ in cognition (Hammer/Elby) 324, 325
retrosynthetic analysis 274, 277, 278
rigged up perceptual systems (RUPS) 235
Robin/Taylor epistemic stance case studies 113–114, 115–121
rotatory mental abilities 23–33
rote learning 216, 272, 277, 306, 313
rubrics 83–84, 308, 313, 315
rule-based reasoning patterns 40, 45–46, 50–53, 78
RUPS (rigged up perceptual systems) 235
Russ WTL assessment framework 291, 293, 299, 301
Sankey diagrams 40–41
SAQs see systemic assessment questions (SAQs)
SATL (systemic approach to teaching and learning) 180, 181–187
scaffolded opportunities (Wood) 3–4, 64–66, 82–84, 175, 261
‘scalar level below’ concept 61

scale-free networks 93
Schönborn/Anderson’s model 37–39, 54
science, technology, engineering, and mathematics (STEM) 142, 237, 244–245, 308, 309, 312
scientific model/conceptualization 217–218
scientific reasoning 180–181
  see also causal mechanistic reasoning; reasoning...
scoring schemes see coding schemes
secondary/tertiary education developments 256–260
semantic networks 96–102
‘sensemaking’ (Odden/Russ) 334–335
sequence of events in reactions 12–13
Sevian/Talanquer reasoning modes
  constructive alignment study 75, 76, 77, 83
  ‘what?/why?’ study 126, 128, 129
Shulman’s ‘special amalgam’ PCK idea 197
small spaces teaching 237–239
SMQs (Systemic Matching Questions) 184, 185
SN1/E1 reactions 256–257, 281
SN2 reactions 66–70, 71
social cognitive theory 165
social constructivism 165
social interactions 143, 150, 165
sophistication of explanations 311–314, 316
spatial abilities 19–33, 37, 39, 237–245
  see also disciplining perception spatial thinking study
spatial contiguity principle (Mayer) 23
Spearman’s correlations 10–11, 15

‘special amalgam’ of PCK (Shulman) 197
specific terminology 8, 10, 12
spectroscopic analysis problems 331
SSQs see systemic synthesis questions (SSQs/SSynQs)
SSynQs see systemic synthesis questions (SSQs/SSynQs)
stability of compounds 49–50, 106
static/dynamic information 296
STEM (science, technology, engineering, and mathematics) 142, 237, 244–245, 308, 309, 312
Strike/Posner’s conceptual change theory 323–324
student response systems (observations) 163, 206–207, 321–322, 324–325
sub-gaps 148, 149, 151, 152
subsequent gaps 147–148
success in reasoning 14–16
surgical sutures 327–328
systemic approach to teaching and learning (SATL) 180, 181–187, 219–221
systemic assessment questions (SAQs) 179–193
  case study 190–192
  concept/topic 2D maps 181–186
  conclusions/implications 192–193
  Fahmy/Lagowski’s types 184–185, 219–220
  ‘meaningful understanding’ assessment 181–187
  scientific reasoning skills role in meaningful understanding 180–181
  systemic diagrams 183–190, 219–221
  systemic synthesis questions study 219–221

systemic diagrams 183–190, 219–221
Systemic Matching Questions (SMQs) 184, 185
systemic synthesis questions (SSQs/SSynQs)
  conceptual understanding 214–216
  conclusions/implications 227–229
  examples 185
  research question/objective 222–223
  SAQ types 184
  scoring scheme 223–227
  student-generated 188
  theoretical foundation 216–221
  use in instruction 189, 190, 214–229
target systems conceptual models 217–218
target-oriented synthesis 273–276
‘task and flask’ bridging study 248–264
  conclusions 263–264
  goals/design principles 252–253
  implications 260–263
  research current state 251–255
  secondary/tertiary education developments 256–260, 262
Taylor/Robin case studies 113–114, 115–121
teaching peers as a learning activity 164
teaching scaffolds see scaffolded opportunities (Wood)
‘teaching’ social interaction type 143
temperature dependence 256–257

tertiary education 257–260, 262
tetrahydrofuran 113, 115–121
thalidomide 289, 290, 291, 293, 297
theoretical frameworks
  abstractness 5–6, 8–10
  argumentation/constructive alignment 84
  causal mechanistic reasoning 61
  curved arrow studies 6–7
  reasoning/granularity/comparisons 75–78
  ‘what?/why?’ study 126
‘theory–theory’ learning model 323, 324, 325
thinking aloud see argumentation; interviews
three-D images see augmented reality (AR) studies
three-D Learning Assessment Protocol (Laverty) 326–328
three-D tasks/assessments 326, 327–328, 329, 330
Tobii T120 eye tracker 8
topic-specificity, pedagogical content knowledge 197
Toulmin’s claim-evidence-reasoning (CER) model 21, 112, 114, 122, 167
‘Training OC’ course 127, 137–138
translation tasks 42–49
‘tutoring’ social interaction type 143
‘understanding’ epistemic stance 114, 116, 119, 121
Vachilous’s rubric 187
validation of machine-learning models 309
‘vexing questions’ (Odden/Russ) 334, 335

videos 162–163, 238
virtual three-dimensional images see augmented reality (AR) studies
Virtuality Continuum (Milgram) 22–23, 64–66
Watts’ study of writing-to-learn strategies 286, 287, 291, 293, 301
‘what?/why?’ study
  activities of entities 130, 132–134
  bonding code 133, 135, 136
  charges code 135–136
  coding scheme 127–129
  explicit electron movement code 130, 133
  implications 138–139
  limitations 137–138
  methods 127–129
  nucleophile–electrophile code/concept 129, 130, 132, 133, 135, 137
  properties of entities 130–131
  questions 217, 293–297
  reaction mechanisms 125–139
  research questions 126–127
  results/discussion 129–137
  theoretical background 126
  ‘what?’ descriptions 129–134
  ‘why?’ descriptions 134–137
Wittig base-free reaction 290, 291, 293, 297
working memory capacity 20
working methods 7–11
writing-to-learn (WTL) strategies 126–127, 286–287, 288–292
written descriptions 288–292, 304–316
  automated analysis see machine learning models study