Enhancing Retention in Introductory Chemistry Courses: Teaching Practices and Assessments (ISBN 9780841235113, 0841235112)



English, 222 pages, 2019


Table of contents :
Enhancing Retention in Introductory Chemistry Courses: Teaching Practices and Assessments......Page 2
Enhancing Retention in Introductory Chemistry Courses: Teaching Practices and Assessments......Page 4
Library of Congress Cataloging-in-Publication Data......Page 5
Foreword......Page 6
Reflection Activities in General Chemistry Laboratories: An Active Learning Strategy to Connect Laboratories with Lectures......Page 8
Subject Index......Page 9
Current State and the Quality of STEM Education......Page 10
Student-Centered Teaching Practices......Page 11
Instructors’ Role in Improving Retention......Page 12
Adjusting Teaching Practice and Assessment: Effort That Needs Good Intention and Courage......Page 13
Chapter 3: Implementing Metacognitive Writing in a Large Enrollment Gateway Chemistry Class: Uma Swamy and Jennifer Bartman......Page 14
Chapter 4: Improving First-Semester General Chemistry Student Success Through Retrieval Practice: Saul R. Trevino, Elizabeth Trevino, and Mary Osterloh......Page 15
Chapter 6: Adaptive Learning Technology in General Chemistry: Does It Support Student Success?: Jessica M. Fautch......Page 16
Chapter 8: Preventing Mole Concepts and Stoichiometry from Becoming “Gatekeepers” in First Year Chemistry Courses: A. M. R. P. Bopegedera......Page 17
Chapter 9: Adaptation and Assessment of a Gradual Release of Responsibility Model for a Large-Enrollment General Chemistry Course: Nicole Lapeyrouse and Cherie Yestrebsky......Page 18
Chapter 11: Changing Instructor Attitudes and Behaviors to Support Student Learning and Retention: Supaporn Kradtap Hartwell......Page 19
Conclusions or Looking Forward?......Page 20
References......Page 21
Introduction......Page 24
The Big Fish-Little Pond Effect (BFLPE) and Individual Relative Deprivation (IRD) Effect in Gateway Chemistry Courses......Page 26
Addressing the Issue of College Preparation: A Brief Literature Review on Preparatory General Chemistry Courses......Page 27
Types of Preparatory Courses and Teaching Strategies......Page 30
University and Student Profile......Page 33
General Chemistry at HPU......Page 34
Metrics for Identifying At-Risk Students and Placement in Bridging Course......Page 35
Figure 1. AI histogram of students enrolled in GC-I Fall 2014 - Fall 2016. Mean AI = 78.0, N = 623.......Page 36
Figure 3. Final GC-I grade as a function of AI, Fall 2014 - Fall 2016. (R2 = 0.356, N=623). Linear regression and variance are shown. Vertical line is located at AI = 78 used for advising into PS-C. Horizontal line is located between C- and D+ grade showing progression to General Chemistry II. W grades have been arbitrarily assigned GPA = 0.1 and shown as open circles.......Page 37
Teaching Strategies Used in Bridging Course and Approaches to Measuring the Impact of Those Strategies......Page 38
Data Collection and Analysis......Page 40
Research Question #1......Page 41
Figure 6. Sankey diagram of PS-C student outcomes from students enrolled Fall 2017 and Spring 2018 (left) to Fall 2018 GC-I (right).......Page 43
Figure 7. Fall 2018 percent grade distribution of students enrolled in GC-I Only compared to PS-C followed by GC-I.......Page 44
Figure 8. Fall 2018 semester 2009 First Term General Chemistry ACS examination percentile score as a function of final course GPA for GC-I Only compared to PS-C/GC-I. Linear regression shown for each data set.......Page 45
Figure 9. Growth Mindset instrument results from the beginning (pre) and end (post) of the semester. Average scores can range from 0 to 6. Error bars are standard error of the mean.......Page 46
The Grand Challenge of Gateway Courses......Page 47
Figure 11. Self-concept instrument results from the beginning (pre) and end (post) of the semester. Average sub-scale scores can range from 0 to 100%.......Page 48
Conclusions, Limitations, and Future Work......Page 49
References......Page 50
Implementing Metacognitive Writing in a Large Enrollment Gateway Chemistry Class......Page 58
Metacognition and Self Regulation......Page 60
Introducing Students to Metacognition and Self Regulation......Page 62
Using Writing to Promote Learning......Page 64
The Second Set of Prompts—Reflection on Preparation for the First Midterm......Page 68
The Fourth Set of Prompts—Starting to Reflect on Their Performance after the First Midterm......Page 69
The Fifth Set of Prompts—Reflecting on Their Performance as a Whole after the First Midterm......Page 70
Instructor Notes and Final Thoughts......Page 71
References......Page 73
Method......Page 78
Figure 1. Sample of the study sheet containing 11 polyatomic ions and 4 unit analysis tools.......Page 79
Figure 2. Round 1 Test Sample.......Page 80
Retrieval Practice Experience Metacognition Follow-Up......Page 81
Limitations of the Study......Page 82
Concluding Remarks......Page 83
References......Page 84
Introduction......Page 86
Course Approach......Page 87
In-Class Group Solution of Fundamental Chemistry Multiple-Choice Questions......Page 88
Slides, Graphics, PhET Simulations, YouTube Videos......Page 89
Figure 1. Page 1 of Class Notes 4 on Chapter 4: Chemical Quantities and Aqueous Reactions. Answers are in italics. Students draw the particulate view of the dissolved species.......Page 90
Voluntary Learning Team Meetings Facilitated by the Instructor for Extra Points......Page 91
Chapter Quizzes......Page 92
Cumulative Final Exam......Page 93
How Much Has All This Helped?......Page 94
Figure 2. Correlation, as percentage, between 140 students’ course outcomes (Passed, Failed/Dropped), Pre-Test Grades (A, B, C P-Test), and 1st Advisory Grades (A, B, C or D-F, on bars) for six CHM 221 sections.......Page 95
References......Page 97
Introduction......Page 100
Figure 1. Fall 2017 enrollment in General Chemistry I (CHM 134) by major. A total of 279 students enrolled at the start of the semester, with engineering students comprising nearly half of the population.......Page 101
Adaptive Learning......Page 102
Data Collection......Page 103
ALEKS: Adaptive Experience......Page 104
Figure 2. Performance on chemistry content questions at the start (pre) and at the end of the semester (post). The ALEKS group is in pink/red (n=10) while the non-ALEKS group is in blue (n=41).......Page 105
Student Opinion Surveys......Page 107
Figure 3. Self-reported survey responses on a scale of (1) strongly disagree to (7) strongly agree. The ALEKS group is depicted in pink/red (n=10) and non-ALEKS is in blue (n=41). The error bars indicate the standard error of the mean, while the difference in pre-post opinion is noted by the delta values to the right of the bars. For Q4, the ALEKS population had responses that changed significantly.......Page 108
Final Grade Distribution......Page 109
Student Experience: Comparison between ALEKS and Non-ALEKS Homework......Page 110
Conclusions......Page 111
References......Page 112
Introduction: Specifications Grading......Page 114
The Specifications Grading Philosophy......Page 115
The High-Stakes Unit Quiz......Page 116
There Are Multiple Opportunities for a Student to Demonstrate Mastery on a Unit......Page 117
Course Details......Page 118
Defining Final Grades......Page 119
Figure 1. Key components from a General Chemistry I syllabus that uses specifications grading. A) Essential and ordinary learning outcomes, B) Exposition of unit quizzes, C) Exposition of quiz retakes, D.) Incorporation of exams, laboratory work, and participation.......Page 123
Figure 2. Most students (28/30) in Spring 2018 achieved Mastery on enough Unit Quizzes to earn their percentage-based final grade. Many students (15) achieved Mastery on more Unit Quizzes than necessary.......Page 124
Transitioning to a Fully Specifications Grading-Based Course: Fall 2018 and Spring 2019......Page 125
Define Mastery to Encourage Deep Learning......Page 126
Conclusions......Page 127
References......Page 128
Preventing Mole Concepts and Stoichiometry from Becoming “Gatekeepers” in First Year Chemistry Courses......Page 130
Designing the First Worksheet......Page 131
Advantages of the Worksheets......Page 132
The Learning Environment......Page 133
Forming Student Teams......Page 134
Evidence of Success......Page 135
Figure 2. Students working in teams of two on mole concepts and stoichiometry worksheets in a tiered lecture hall (above) and in a computer laboratory (below).......Page 136
Figure 4. Comparing students’ performance in MC&S problems in the ACS General Chemistry Standardized Exam with their overall performance in the same exam. Data are from a chemistry majors’ course.......Page 137
Future Work......Page 138
References......Page 139
Motivation......Page 146
Framework......Page 147
Course Description......Page 148
Adaptations to the GRR Framework......Page 149
Student Academic Performance......Page 150
Figure 3. Overall grade comparison between Traditionally- (white) and GRR- (grey) taught courses.......Page 151
Figure 4. Divergent graph of gender grade distribution: Female (grey) and male (polka dots).......Page 152
Survey Results......Page 153
References......Page 154
Introduction......Page 156
Figure 1. Schematic representation of student perceptions on laboratory and lecture content before, during, and after the reflection activities.......Page 157
Reflection Activities to Promote Higher Level Thinking......Page 158
Lab Notebook......Page 159
Quizzes......Page 160
Teaching Assistants......Page 161
Results and Discussion......Page 162
Quizzes......Page 163
Demonstrations......Page 164
Teaching Assistants......Page 165
Figure 3. Flash cards based on student discussions. Students connected the laboratory experiment with various concentration units and pH of the solutions. Important notes from students (a) molarity uses volume (of the solution) and molality utilizes mass (of the solvent) and (b) not all salts are neutral.......Page 166
Spring 2018......Page 167
Synchronized Laboratory and Lecture Content......Page 168
Student Attitudes towards Synchronized Laboratory and Lecture Content......Page 169
Performance of LS Students in Lecture Courses......Page 170
With Reflection Activities......Page 171
Figure 4. Lab grades vs. final exam scores for LS and LD students during Spring 2014, Spring 2015, and Spring 2016 semesters. Reflection activities were not implemented in the laboratory courses for the LS students.......Page 173
Lab Grade vs. Final Exam Score......Page 174
Tips for Reflection Activities......Page 175
Acknowledgments......Page 176
References......Page 177
Changing Instructor Attitudes and Behaviors to Support Student Learning and Retention......Page 178
Rationale: Why Should We Change?  Shouldn’t It Be the Students Who Change?......Page 179
Influences of Instructor on Student Retention......Page 180
The Process of Change in Teaching Attitudes and Behaviors......Page 181
Self-Examining and Developing of Compassionate Mindset......Page 182
Teacher Empathy and Affective Attitudes Inventory......Page 184
Sharpening Interpersonal Communication and Presentation Skills......Page 185
On the First Day of Class......Page 186
During Each Class......Page 187
Giving and Grading Assignments......Page 188
Implementing and Maintaining New Teaching Behaviors and Practice......Page 189
Conclusion......Page 190
References......Page 191
Strategies to Prevent Cognitive Overload: A Team-Based Approach to Improving Student Success and Persistence in a Gateway Introductory Chemistry Course......Page 196
Introduction......Page 197
Adjunct Faculty as a Team......Page 198
Cognitive Overload......Page 199
Lecture Class Structure Management Techniques—The Study Cycle 3......Page 201
Figure 1. The study cycle outline utilized to assist in CHEM 101 course design, starting in Fall 2013. Reproduced with permission from reference 3. Copyright 2013 American Chemical Society.......Page 202
Study......Page 203
Figure 2. Example worksheet problem utilizing stoichiometry frameworks to guide the students to solve the problem 8.......Page 204
Data Analysis......Page 205
Figure 3. The percentage of students successfully completing CHEM 101 with an A, B or C grade, in a traditional classroom setting from Fall 2012 through Fall 2018 Semesters.......Page 206
Conclusion......Page 207
References......Page 208
Supaporn Kradtap Hartwell and Tanya Gupta......Page 210
Tanya Gupta......Page 212
Indexes......Page 214
Author Index......Page 216
G......Page 218
R......Page 221
S......Page 222



Strategies to Improve Student Retention in STEM Classes

General chemistry courses are required for many majors and subsequent careers, and instructors work hard to facilitate student success. This volume presents strategies used by chemistry instructors to improve retention and lower the number of students retaking the course. From innovative teaching methods to alternative assessments, this toolkit of actionable ideas is sure to inspire those in faculty and administrative roles.


ACS Symposium Series 1330

Enhancing Retention in Introductory Chemistry Courses: Teaching Practices and Assessments

Supaporn Kradtap Hartwell, Editor
Department of Chemistry, Xavier University, Cincinnati, Ohio, United States

Tanya Gupta, Editor
Department of Chemistry & Biochemistry, Avera Health & Science Center, South Dakota State University, Brookings, South Dakota, United States

Sponsored by the ACS Division of Chemical Education

American Chemical Society, Washington, DC

Library of Congress Cataloging-in-Publication Data
Library of Congress Control Number: 2019045863

The paper used in this publication meets the minimum requirements of American National Standard for Information Sciences—Permanence of Paper for Printed Library Materials, ANSI Z39.48-1984.

Copyright © 2019 American Chemical Society. All Rights Reserved.

Reprographic copying beyond that permitted by Sections 107 or 108 of the U.S. Copyright Act is allowed for internal use only, provided that a per-chapter fee of $40.25 plus $0.75 per page is paid to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, USA. Republication or reproduction for sale of pages in this book is permitted only under license from ACS. Direct these and other permission requests to the ACS Copyright Office, Publications Division, 1155 16th Street, N.W., Washington, DC 20036.

The citation of trade names and/or names of manufacturers in this publication is not to be construed as an endorsement or as approval by ACS of the commercial products or services referenced herein; nor should the mere reference herein to any drawing, specification, chemical process, or other data be regarded as a license or as a conveyance of any right or permission to the holder, reader, or any other person or corporation to manufacture, reproduce, use, or sell any patented invention or copyrighted work that may in any way be related thereto. Registered names, trademarks, etc., used in this publication, even without specific indication thereof, are not to be considered unprotected by law.

Printed in the United States of America.

Foreword

The purpose of the series is to publish timely, comprehensive books developed from ACS-sponsored symposia based on current scientific research. Occasionally, books are developed from symposia sponsored by other organizations when the topic is of keen interest to the chemistry audience. Before a book proposal is accepted, the proposed table of contents is reviewed for appropriate and comprehensive coverage and for interest to the audience. Some papers may be excluded to better focus the book; others may be added to provide comprehensiveness. When appropriate, overview or introductory chapters are added. Drafts of chapters are peer-reviewed prior to final acceptance or rejection. As a rule, only original research papers and original review papers are included in the volumes. Verbatim reproductions of previously published papers are not accepted.

ACS Books Department

Contents

1. Student-Centered Teaching Practices and Assessments to Enhance Retention: An Introduction
   Supaporn Kradtap Hartwell and Tanya Gupta .......... 1
2. Strategies, Techniques, and Impact of Transitional Preparatory Courses for At-Risk Students in General Chemistry
   Brian H. Augustine, Heather B. Miller, M. Todd Knippenberg, and Rachel G. Augustine .......... 15
3. Implementing Metacognitive Writing in a Large Enrollment Gateway Chemistry Class
   Uma Swamy and Jennifer Bartman .......... 49
4. Improving First-Semester General Chemistry Student Success Through Retrieval Practice
   Saul R. Trevino, Elizabeth Trevino, and Mary Osterloh .......... 69
5. Scaffolding Underprepared Students’ Learning in General Chemistry I: Approach and Assessment
   Suely Meth Black .......... 77
6. Adaptive Learning Technology in General Chemistry: Does It Support Student Success?
   Jessica M. Fautch .......... 91
7. Introducing Components of Specifications Grading to a General Chemistry I Course
   Langdon J. Martin .......... 105
8. Preventing Mole Concepts and Stoichiometry from Becoming “Gatekeepers” in First Year Chemistry Courses
   A. M. R. P. Bopegedera .......... 121
9. Adaptation and Assessment of a Gradual Release of Responsibility Model for a Large-Enrollment General Chemistry Course
   Nicole Lapeyrouse and Cherie Yestrebsky .......... 137
10. Reflection Activities in General Chemistry Laboratories: An Active Learning Strategy to Connect Laboratories with Lectures
    Jayashree Ranga .......... 147
11. Changing Instructor Attitudes and Behaviors to Support Student Learning and Retention
    Supaporn Kradtap Hartwell .......... 169
12. Strategies to Prevent Cognitive Overload: A Team-Based Approach to Improving Student Success and Persistence in a Gateway Introductory Chemistry Course
    Marguerite H. Benko, Keith M. Vogelsang, Kristin C. Johnson, and Allison R. Babij .......... 187

Acknowledgments .......... 201
Editors’ Biographies .......... 203

Indexes
Author Index .......... 207
Subject Index .......... 209

Chapter 1

Student-Centered Teaching Practices and Assessments to Enhance Retention: An Introduction

Supaporn Kradtap Hartwell*,1 and Tanya Gupta*,2

1Department of Chemistry, Xavier University, Cincinnati, Ohio 45207, United States
2Department of Chemistry & Biochemistry, South Dakota State University, Brookings, South Dakota 57007, United States

*E-mails: [email protected]; [email protected].

Quality education in the STEM fields is a major global concern in academia. Problems stem from both students and instructors, including their mindsets about learning and teaching. A pipeline of students entering college without effective study skills drives high attrition rates in introductory science and math courses, and instructors' unwillingness to adjust their teaching to support those students only makes retention worse. This book presents a collection of teaching practices, teaching activities, and assessment strategies that instructors from different types of institutions have tried in their introductory chemistry classes with the aim of reducing DFW rates and improving student retention in their programs. The authors of each chapter share the common challenge of student attrition, and each has evaluated the effect of a different approach on promoting successful completion of the course. Classroom management and the parameters that contributed to success, as well as limitations, are discussed in each chapter.

Current State and the Quality of STEM Education

The number of students enrolling in science, technology, engineering, and mathematics (STEM) courses shows an upward trend. However, the number of students actually attaining a STEM degree remains fairly constant, in the range of 15-17 percent. There is a persistent gap between the number of freshmen who declare a STEM major and those who actually complete a STEM degree. In its report on STEM teaching, the Association of American Universities (AAU) committee indicated that about 90% of students who changed majors from STEM to other areas cited poor quality of teaching as one of the contributing factors in their decision. In addition, most students who leave a STEM discipline do so within the first two years of college, which means that these first two years are crucial for student retention in STEM areas (1–3).

© 2019 American Chemical Society

Several factors contribute to improving the quality and effectiveness of college STEM instruction: 1) a thoughtful, strategic, and action-oriented departmental shift from individual to collective responsibility for introductory-level courses; 2) using institution-wide data and analytical tools on student instruction and learning outcomes; 3) aligning administrative resources and support services, such as Centers for Teaching and Learning, to better support departmental reform efforts and provide campus-wide structures for reform initiatives; 4) developing and redesigning learning spaces to move away from traditional instruction and toward collaborative learning that fosters student interaction and engagement; 5) simultaneously managing the pursuit of effective teaching and the scholarship of teaching for assessment and evaluation of teaching; and 6) committing to systemic, long-term reform by investing in institutional resources and aligning those resources to ensure the reforms succeed. In addition, institutions should commit to change by actively seeking and engaging in collaborative opportunities with regional and national agencies and with communities of practice that foster such growth (3, 4). Discipline-Based Education Research (DBER) has found that chemistry courses present a particular challenge for STEM students (5). These courses suffer from high DFW rates and are often nicknamed “weed out” or “gatekeeping” courses. For STEM majors, these first- and second-year courses are a prerequisite for the rest of the science curriculum, and hence they affect overall success rates in STEM fields and interdisciplinary areas.
Some of the main obstacles in science education, and particularly in chemistry education, include students' lack of motivation and low self-confidence in learning, achievement gaps among student demographic subgroups, and high student-to-instructor ratios (6). In addition, broken connections between lessons, for example between lecture content and laboratory experiments, as well as with other related subjects, aggravate the problem and need to be fixed (7). Factors such as race, language, culture, gender, physical and mental health, and socioeconomic status influence the knowledge and experience students bring as well as their ability to learn. Content knowledge is important, but understanding the audience, the students, and paying attention to the factors that contribute to their success are equally important for high-quality teaching. Instead of being a sage on the stage, the instructor should act as a facilitator of student learning in a manner that supports the diverse students in a class (8). Despite calls to improve the quality of teaching in college STEM courses, faculty members teaching undergraduate science and engineering courses are reluctant to make changes that actually benefit students, due to factors including lack of time, overwhelming workload, and lack of administrative support (9, 10). In most undergraduate college courses, student-centered, evidence-based teaching practices remain restricted to a few classrooms and have not spread to the level anticipated. Most colleges and universities in the US have not yet achieved student-centered teaching.

Student-Centered Teaching Practices

Instructors should take time to ask students what they need to improve their learning, and make an effort to focus on the needs the students identify. For example, students with busy part-time job schedules may prefer video-call office hours to meeting in person. Instructors should try to accommodate such requests even when this means learning new technologies. Being student-centered is not merely a matter of instructor behaviors or actions in a classroom; it is more about instructor attitudes toward students and teaching. Student-centered teaching, also known as responsive teaching or individualized instruction, stems from the humanistic educational philosophy, whose underlying goal is to develop people who are open to change through learning, are willing to strive for self-actualization, and can work with one another as fully functional individuals. Thus, the role of the instructor under the humanistic philosophy is to develop the whole person by being a facilitator, helper, or partner in the learning process (11). Student-centered teaching starts before students enter a class. It includes the instructor's prior training and understanding of quality instruction and effective instructional planning. It also rests on the assumption that instructors have a high level of mastery of the content and can align teaching objectives and strategies with the diversity of students in the classroom by considering all viable options. A student-centered, responsive teacher periodically engages in self-reflection, assessing their own mastery of content and teaching practices to identify the changes needed to address student needs. Over time, such an instructor develops an integrative teaching style that blends respect, reasons, options, and proficiency in both the content and the strategies used to create optimal learning conditions for the learners (12). This is a challenging feat, as it demands time, resources, attention to and knowledge of student diversity, administrative support, and understanding of program objectives and expectations. Abundant evidence shows that the traditional way of teaching, i.e., the “lecture only” approach, is inadequate to help the new generation of students learn successfully (13, 14).
The National Science Teaching Association (NSTA) (7) lists four key strands in science education: 1) developing the ability to know, use, and interpret scientific explanations; 2) generating and evaluating scientific evidence; 3) understanding the development of scientific knowledge; and 4) learning to participate productively in scientific practices and discussions. These key components of high-quality science education should be introduced during K-12 and developed further, at a deeper level, in college. In reality, however, the majority of students entering college do not possess basic effective learning skills, especially for math and the sciences. The last strand, in particular, points to the need for active learning. Active learning activities can either decrease or increase student anxiety, depending on how they are implemented (15). With growing DFW rates, it is more important than ever for instructors, especially in introductory courses, to implement active learning strategies properly so as to improve educational outcomes. Instructors should also develop the communication skills needed for effective interaction with students across generations and learning abilities. Among effective methods of teaching and learning, student-centered teaching practices have gained prominence over the past several years. Students taught with active learning and student-centered approaches are more likely to succeed than students taught with traditional methods such as lecturing. Student-centered practices especially benefit disadvantaged students, at-risk students, and female students (14, 16, 17).

Instructors’ Role in Improving Retention

Considering their main duties of teaching and assessment, and the high level of anxiety among freshmen adjusting to college life (18), the instructors of introductory courses have an important role in supporting first-year college students. Instructors should pay especially close attention and provide extra support to students during this transition. It is necessary that instructors accept these responsibilities, be willing to develop rapport with students, and invest time and effort in exploring teaching pedagogies as well as alternative assessment methods that support and encourage student learning.


It is well established by education research that the instructor-centered teaching style is not effective. It is important to move away from an old-fashioned informational class setting that focuses only on exam achievement. Instructors have an important duty to transform students into lifelong independent learners. Therefore, students should be put into active roles where they take responsibility for their own learning. This does not mean, however, that students alone should bear full responsibility for their learning. Students are novice learners who need support and guidance from instructors to develop effective learning skills. Instructors should make use of their relationships with students to model a passion for learning, coach them to set learning goals, and teach them to self-assess (19). Although, in the end, the student’s own attributes matter most, instructors share in the responsibility to encourage students to develop persistence and perseverance on their journey through higher education.

Adjusting Teaching Practice and Assessment: Effort That Needs Good Intention and Courage

As chemistry instructors, many of us have probably attended seminars and workshops by well-known speakers, or read best-selling books on new teaching pedagogies and assessment methods that claim to help retain students and improve their learning. Yet we cannot help feeling skeptical, or too overwhelmed to try, new teaching practices that are based on education research outside STEM fields or that originate from a type of institution different from our own. While we may feel that something seriously needs to be done to improve the academic situation of our students, many of us hesitate to make any changes to our classes for fear of poor evaluations from students and peers. While the student mindset is important in effective learning, the instructor mindset and soft skills are also essential in effective teaching (20). In order to accommodate each new, ever-changing cohort of students, instructors must not stop cultivating and experimenting with innovative teaching strategies (21). Willingness to listen to and reflect on one’s own as well as other instructors’ experiences will enhance and sustain our growth mindset. The biggest barrier to improving STEM education is the lack of knowledge about how to effectively spread the use of currently available, tested, research-based instructional ideas and strategies. To an extent, this barrier was addressed through the symposium “Enhancing Student Learning and Retention in Undergraduate “Gatekeeping” Introductory Chemistry Courses,” organized by Dr. Supaporn Kradtap Hartwell of Xavier University during the Biennial Conference on Chemical Education (BCCE-2018) held at the University of Notre Dame, South Bend, Indiana. The chapter contributions in this book present a journey of reform by pedagogy-activists.
We use this term to identify teachers and instructors who are not satisfied with the status quo of their teaching, are concerned about the academic apathy of students, and are taking bold steps to reform their practice. These pedagogy-activists are crossing barriers to willingly share their efforts and outcomes with a wider audience. This book presents a collection of teaching practices, class activities, and alternative assessment strategies that instructors from different types of institutions have tried in their introductory chemistry classes. They are examples of instructors putting good intention and courage into practice. All chapters in this book involve pedagogies used by the authors, informed by their personal teaching experiences or by classroom research conducted in real classroom settings. These teaching pedagogies are literature-based practices that many of us may have already heard or read about, but have never tried or dared to try. Our colleagues have now shared their findings so that other chemistry instructors who would like to implement the same or similar strategies can learn and reflect from their experiences. Whether you are a new college instructor or a very experienced one who inevitably feels disconnected from educating the new generation of students, the editors believe that you will find these peer-reviewed chapters useful. At the least, you should find it encouraging to realize the hard work our colleagues have performed. Each contributor to this book has a unique experience to share, and readers will find common ground, perhaps similar struggles, and, most importantly, new ideas for dealing with the challenges of undergraduate chemistry teaching. A brief summary of each chapter in this book is presented next.

Chapters Overview

Chapter 2: Strategies, Techniques, and Impact of Transitional Preparatory Courses for At-Risk Students in General Chemistry: Brian H. Augustine, Heather B. Miller, M. Todd Knippenberg, and Rachel G. Augustine

Chemistry education research has pointed out that the lack of mathematical and problem-solving skills is the main limitation keeping freshmen from successful completion of first-year chemistry courses. The chapter identifies promising transitional approaches and discusses the importance of addressing the gateway chemistry problem at institutions serving predominantly undergraduates. Augustine and co-authors present efforts of reform at High Point University (HPU) in North Carolina via the Problem Solving in Chemistry (PSIC) course over a period of three years. At-risk students were identified using multivariate analysis, which led to the creation of a separate transitional bridging chemistry course that employs best practices in chemistry, problem solving, deliberate practice, and metacognitive strategies. Using these interdisciplinary ideas, the authors present a bold argument that, irrespective of the institution, it is very important to identify at-risk students at the bottom of a cohort. Further, first and foremost, instructors and educators need to develop strategies to improve students’ academic self-concept for a meaningful change in learning and engagement. For an audience seeking information on the types of courses offered by various institutions, Augustine et al. provide an overview of both concurrent and bridging courses. Table 1 in this chapter summarizes preparatory courses offered at several institutions, with target populations, interventions, and outcomes listed. For an audience seeking information on the strategies used in these courses, this chapter is a great resource with an in-depth literature review.
The uniqueness of this chapter lies in its introduction and description of the various approaches and measures used to identify at-risk students (Admissions Index, SATM scores, Placement Decision Matrix), and also in its presentation of results in a meaningful way (Progression Rate Plots, Sankey Diagrams). The chapter covers several teaching strategies, and approaches to measuring their impact, such as growth mindset, deliberate practice, repetition, and dynamic study modules that were implemented in the bridging courses at HPU.

Chapter 3: Implementing Metacognitive Writing in a Large Enrollment Gateway Chemistry Class: Uma Swamy and Jennifer Bartman

Most students coming to college view a higher letter grade as the measure of their performance and often show a deep concern for their grades. It is rare for instructors to hear from students questions such as “What are the gaps in my knowledge?” or “Based on this test, what can I do to make sure that I fill those gaps and develop my understanding of these topics?” Most students lack qualities relating to affective growth, such as motivation, self-regulation, and the ability to reflect on their current state of knowledge and efforts. These qualities are a major contributing factor in developing a coherent knowledge base and are also responsible for student academic success. In their chapter, Swamy and Bartman present a study on metacognitive strategies used with students throughout the semester. Students use prompts provided by the instructor to reflect on their learning and study strategies, expected grades, and exam preparation, and also to analyze their own exam performance. Such strategies help ensure that students gain confidence in their own skills and connect their actions and efforts with outcomes, including their successes and failures in achieving a certain goal (an anticipated grade in a course). The study was conducted at Florida International University (FIU) and would be of interest to an audience seeking detailed information on the implementation of metacognitive strategies, especially in a large class setting, to address the affective domains of non-traditional college students. These students usually need to balance work and family responsibilities and are severely underprepared for meeting the demands of college. The general chemistry course described includes a combination of flipped teaching, active learning, and facilitation by trained undergraduate Learning Assistants (LAs). The chapter highlights that student beliefs about learning, which often come from prior classroom experiences, stand in the way of their progress. Student performance on first exams exposes the unrealistic expectations students build for themselves through counterproductive study habits. The chapter includes extensive literature on metacognitive learning and self-regulated learners, and connects these ideas with student success. The authors have also provided an overview of several metacognitive strategies.
Examples of a series of cognitive, metacognitive, and mixed prompts used regularly and periodically throughout the semester, as well as details about the administration of these prompts, should be very useful for other instructors who would like to develop metacognitive activities for their own classes.

Chapter 4: Improving First-Semester General Chemistry Student Success Through Retrieval Practice: Saul R. Trevino, Elizabeth Trevino, and Mary Osterloh

As exemplified in the prior chapter, first-year college students often lack independent learning skills. One way to address this gap is to incorporate metacognitive strategies into college courses. Trevino and co-authors present a short study on the implementation of retrieval practice coupled with metacognitive prompt worksheets, used by students in a first-semester general chemistry course at the small and ethnically diverse Houston Baptist University in Texas. Education researchers in the language field claim that “retrieval practice,” or the practice of self-testing, helps to improve long-term retention of learning material. The practice requires re-studying the material and re-testing oneself, as many times as necessary, on questions missed on the previous test. The modified retrieval practice approach used in this study by Trevino et al. is a refined version of the Karpicke and Roediger method, involving student self-assessment of a topic and peer review of student performance in 50-minute class sessions facilitated by the instructor. The chapter describes the authors’ attempt to instill in first-semester general chemistry students a learning strategy that may help them become more independent and more effective learners. By implementing the practice in the classroom, students have a clear example of how to conduct this learning strategy on their own outside of class.
The authors present examples of retrieval practice materials, which should be easily adapted for use at other institutions. The study shows a positive impact of retrieval practice on student retention through iterative implementation. The authors also candidly discuss the limitations of the study, which can help readers considering similar studies remain cognizant of class time and the learning materials being used.

Chapter 5: Scaffolding Underprepared Students’ Learning in General Chemistry I: Approach and Assessment: Suely Meth Black

According to the President's Council of Advisors on Science and Technology (2), of the 30% of students who are from underrepresented racial and ethnic groups, only 9% go on to join the science and engineering workforce. Historically Black Colleges and Universities (HBCUs) are major contributors to this pool of underrepresented STEM graduates. Problem solving, quantitative reasoning, and modeling skills are important for student success in chemistry and other science courses, as well as for future science careers. However, current practices in college chemistry courses fall short of achieving the goal of developing a STEM-literate workforce. In this chapter, Black describes efforts toward best practices and the metacognitive approaches used to reach students of diverse abilities. The chapter focuses on the first-semester general chemistry course at Norfolk State University (NSU), an HBCU in Virginia with a population of 5,000 undergraduate and graduate students. The chapter includes a systematic overview of the teaching practices adopted at NSU, their implementation, and their impact on student learning and performance. The challenges in first-year chemistry courses remain the same across diverse institutions: differing levels of mathematical ability among students, lack of chemistry background, poor reading and comprehension skills, poor or inconsistent study habits, lack of ability to reflect on one’s efforts and plan for success, and the need to balance academic life with work, family, or other personal aspects of life. Through this chapter, Black highlights that, irrespective of the outcome, it is important to use multiple practices and strategies to reach underprepared students.
The chapter provides a detailed overview of the formative and summative assessments used throughout the semester, and of activities emphasizing deliberate practice and trial and error that were paced with a uniform delivery of chemistry content. The chapter also describes a modified approach to Peer-Led Team Learning (PLTL) for out-of-class scaffolding of student learning. Such learning experiences especially benefit low-performing students and help improve the overall pass rate and student success. Although using diverse strategies does not solve all problems, it does help address some of the challenges of underprepared students.

Chapter 6: Adaptive Learning Technology in General Chemistry: Does It Support Student Success?: Jessica M. Fautch

Technology has become the norm in our daily lives, and it has influenced how students learn and how instructors teach. Many commercial platforms aim to help instructors of general chemistry improve student performance. Most of them emphasize interactive online homework to ease the instructor’s burden and to help students make progress at their own pace. This chapter by Fautch focuses on adaptive learning through technology in a first-semester general chemistry course at York College of Pennsylvania. The author conducted a pilot study to investigate the impact of one of the popular commercial adaptive homework systems, Assessment and Learning in Knowledge Spaces (ALEKS). An adaptive system caters to the needs of the individual student by providing a tailored learning experience based on the student’s ability. Motivated students can move ahead quickly, while students who are struggling with foundational knowledge can get more practice until they are ready to learn advanced content. Using common exam questions, final course grades, and student perceptions, the author shows that the system has a positive impact compared to control groups.
It seems that adaptive technologies, if properly implemented, may produce better outcomes for underrepresented minorities and for female students who may be experiencing a belief gap (a feeling that one does not belong in science classes). This work is an example of how an instructor can investigate whether the investment is worth long-term implementation, which can help the department make a better-informed decision. The information in this chapter should be useful for instructors who are considering commercial online tools for teaching.

Chapter 7: Introducing Components of Specifications Grading to a General Chemistry I Course: Langdon J. Martin

This chapter focuses mainly on the assessment of student learning. Student success in chemistry, as in other courses, is commonly measured by student performance during the semester on various tasks and assignments. Letter grades (A, B, C, D) reflect the extent of student success as a weighted average of student performance. This is the traditional approach to assessing student academic performance and is very common in higher education institutions. Grading can sometimes be difficult, and even with rubrics it may still be hard to decide how much partial credit should be given. Martin shares his experience implementing an alternative assessment strategy, so-called “specifications grading,” in the first semester of general chemistry at Warren Wilson College in North Carolina. In specifications grading, the main focus is students’ achievement of learning objectives and mastery of content. The author presents both the benefits and the drawbacks, including the preparation needed to shift to the specifications grading system, and also covers student motivation and student success. The chapter presents the strategies embedded in specifications grading as a better alternative to a partial-credit system. The author’s discussion is very informative and should be very useful for instructors who are deciding whether and how to implement this alternative grading system in their courses.
For an audience considering the use of specifications grading, Table 1 in this chapter compares traditional and specifications grading with respect to familiarity, course structure, summative assessment, and final grade calculations. Martin discusses Backward Design for developing course syllabi (specifically for chemistry instructors). The chapter provides a detailed overview of the implementation of specifications grading, with emphasis on developing high-stakes assessments consistent with the specifications grading system. It covers the key considerations for developing these assessments and the value of planning ahead so that students have multiple assessment opportunities to meet the learning objectives and gain mastery. Table 2 in this chapter will interest an audience seeking an example of a general chemistry course in which both essential and ordinary learning outcomes are presented. The chapter demonstrates the value of giving students an opportunity to reflect on their performance, and of instructors having a flexible mindset toward adapting and refining their assessments to maximize learning opportunities for students.

Chapter 8: Preventing Mole Concepts and Stoichiometry from Becoming “Gatekeepers” in First Year Chemistry Courses: A. M. R. P. Bopegedera

To support students in their first-year chemistry class, some instructors focus on specific topics that are fundamental to student success in chemistry. Mole Concepts and Stoichiometry (MC&S) is a difficult topic for students with little or no background in chemistry, but it is an important one that can help students develop a deeper appreciation for the molecular sciences, as it draws connections between the macroscopic world (experiences) and the microscopic world (atoms and molecules). As topics in chemistry become more challenging, there is a simultaneous decline in student interest and attendance in classrooms and laboratories. Instructors can make challenging topics both interesting and engaging for students by seeking innovative, active learning-oriented ways to teach complex ideas. Poor student performance in these areas needs to be tackled head on. This chapter highlights Bopegedera’s efforts and experience in developing and implementing a series of worksheets on the challenging topic of MC&S. The worksheets were implemented at Evergreen State College, a Primarily Undergraduate Institution (PUI) located in Olympia, Washington. The chapter provides a rationale for, and an overview of, the development of the active learning-oriented worksheets that replaced traditional lectures on MC&S. The worksheets were carefully designed with questions and examples drawn from students’ everyday experiences, and include scaffolding questions to help students draw connections between chemistry and life. The worksheets are chunked into single topics related to the big idea and emphasize conceptual understanding through incremental scaffolding. The chapter presents the advantages of using these worksheets with students and their impact on student performance on MC&S questions on semester exams and on ACS general chemistry exams. Bopegedera also discusses implementing the worksheets alongside a student-centered cooperative learning approach, and the importance of having the right classroom environment and physical space for a measurable impact.

Chapter 9: Adaptation and Assessment of a Gradual Release of Responsibility Model for a Large-Enrollment General Chemistry Course: Nicole Lapeyrouse and Cherie Yestrebsky

The effectiveness of active learning over traditional lecture-based teaching has long been demonstrated by numerous education research studies.
In an active learning environment, instructors use teaching strategies that engage students in the learning process through activities beyond passive listening. Active learning can take many forms, ranging from short and simple activities (e.g., solving short problems individually or in small groups, and paired discussions) to longer and more complicated tasks (e.g., case studies and structured team-based learning). Some instructors may be skeptical of using active learning strategies in place of traditional lecturing, especially in a large class. However, implementing active learning strategies does not require completely abandoning the lecture format. In fact, simply adding small active learning activities during lecture can make lessons significantly more effective for student learning. These short activities can be as simple as giving students a minute or two to check their understanding of recent material, or providing practice questions that allow students to recognize gaps in their knowledge before an explanation is given. In this chapter, the authors implement an active learning approach called the “Gradual Release of Responsibility” (GRR) teaching model in the general chemistry course at the University of Central Florida. The dominant feature of this approach is to slowly shift responsibility from the instructor to the student over a period of time, or even within a single lesson. This is done through a four-phase model: “I do it” (focused instruction), “we do it” (guided instruction), “you do it together” (collaborative learning), and “you do it independently” (independent learning). Each phase gradually withdraws instructor support, allowing learners to become more independent and able to tackle new problems. Importantly, the authors demonstrate how the GRR teaching model can be adapted to suit a large-enrollment course.
Methods of assessment, as well as the attitudes of students in the class, are described. A decrease in DFW rates, an increase in students obtaining higher grades, and positive student attitudes toward this teaching model should make it easier for other instructors to decide whether to adopt this approach.


Chapter 10: Reflection Activities in General Chemistry Laboratories: An Active Learning Strategy to Connect Laboratories with Lectures: Jayashree Ranga

Hands-on experiences significantly advance learning in all areas of study. In STEM fields, conducting laboratory experiments is critical to the learning process, especially for developing problem-solving and critical-thinking skills. In fact, most instructors would agree that there is no equivalent substitute for hands-on laboratory experience. In chemistry, laboratory experiments also give students exposure to various chemical reactions and scientific equipment. Carefully designed experiments should inspire students to further their education and prepare them for future careers. Laboratory and lecture courses should naturally support each other and enhance student learning. The reality, however, is that most chemistry students do not recognize the connection between laboratory and lecture content. Even though the general chemistry laboratory is usually taken concurrently with the lecture, the majority of students cannot clearly see the relevance of one to the other, as if the two courses were unrelated. One reason may be the imperfect alignment of lecture content with the corresponding experiments. The author, Ranga of Salem State University in Massachusetts, tried to help students see the connection by having them work on a reflection activity at the end of each experiment. The aim was to promote student discussion of the connections between the experiment they had just carried out and the content they had learned previously in lecture. This active learning strategy helped improve average final exam scores in the lecture courses. The example prompts shown in this chapter should be very useful for instructors struggling with this same disconnect between laboratory and lecture.
Synchronizing weekly lecture content with the experiments, and having the same instructor for both courses, can further enhance success.

Chapter 11: Changing Instructor Attitudes and Behaviors to Support Student Learning and Retention: Supaporn Kradtap Hartwell

The student retention issue is very complicated and involves many parameters beyond the students’ own attributes. Students’ attitudes toward the learning environment, including their level of comfort with instructors, staff, and peers, are also very important. The author reminds us that the frequency of interaction between instructors and students, both in and outside of class, gives instructors an important role in improving retention. In efforts to improve retention, instructors often focus only on what students are lacking rather than on what they themselves are lacking. Even highly experienced instructors should not overlook the importance of sharpening their soft skills for effective communication with students from different generations and diverse backgrounds. The author, who teaches at Xavier University in Cincinnati, Ohio, urges colleagues to examine whether they have adjusted their own attitudes and behaviors in order to serve their students as best they can. The importance of developing a compassionate mindset is emphasized, and questions are raised for self-examination of one’s level of compassion toward students. Practices accumulated from research articles, from other instructors, and from the author’s own experience are suggested for day-to-day use. The author stresses the finding from education research that, rather than trying to change instructors’ attitudes first, instructors should first change their teaching practices and behaviors while monitoring the impact of these changes.
This is because only after an instructor sees the positive impact of certain behaviors and practices can the change in that instructor’s beliefs and attitudes toward teaching be sustained.


Chapter 12: Strategies to Prevent Cognitive Overload: A Team-Based Approach to Improving Student Success and Persistence in a Gateway Introductory Chemistry Course: Marguerite H. Benko, Keith M. Vogelsang, Kristin C. Johnson, and Allison R. Babij

One of the difficulties in quality control in education is inconsistent teaching standards and differing levels of expectation among instructors, especially those teaching the same subject in different sections. As Benko and co-authors point out, they found large discrepancies in the amount of content covered in various sections of the same chemistry course, with some instructors overwhelming their students with content in greater detail than required by the core objectives. As educators, we have a responsibility to systematically assess what we are doing and the results we are achieving. We must focus on student outcomes and, as challenging as it is, reach consensus among instructors of different sections on exactly what to cover by developing well-focused class learning objectives. This chapter is an excellent example of teamwork and collaboration among instructors to improve student retention at Ivy Tech Community College in Indianapolis, Indiana. All aspects of the class in all sections were examined for both curriculum and pedagogy modifications, including core objectives, laboratory type and content, and class time management. The team revised the curriculum so that all instructors could better align their teaching across the different sections. Well-focused traditional lectures with access to supporting practice problems, and laboratories that clearly connect with each week’s learning objectives, helped improve the success rate from 50% to 60%. When the instructors further adjusted their teaching approach by using active and cooperative learning, the overall success rate increased to 75%.
The chapter shows an exceptional success that helps dispel the myth that teaching by adjuncts is ineffective, as this instructor team at Ivy Tech is composed entirely of adjunct faculty. They show the impact of the instructors’ mindset, a feeling of ownership of the program, and support from the administration, which together form a great model for all institutions.

Conclusions or Looking Forward?

Introductions are a journey on which to look forward and venture into the details and depths of the new; they cannot be called conclusions. The purpose of this introductory chapter has been to bring the problem of college preparedness to the forefront. It is well known that fewer than 50% of students with a declared STEM major in college actually complete their degree. The first two years of college appear to be the most challenging for students: a heavy reliance on math and foundational science courses during this period weeds unprepared students out of STEM majors (22–25). A key aim of STEM education reform efforts is to enhance the persistence of students in STEM areas and to close the achievement gap for underprepared students. Several problems stand in the way of achieving this goal. As shown in this introduction via the chapter summaries, delving into these issues is important if we seek meaningful, long-lasting change in undergraduate chemistry instruction. Such change relies on a) understanding what it means to be a student-centered or responsive teacher, b) efficient planning, c) using tested, evidence-based strategies and approaches, d) being data-driven and ready to measure planned changes to classroom practice, e) being ready to learn not just from one’s own experience but also from that of others through collaboration and dialogue, f) being reflective about one’s own practice and cognizant of the research literature on teaching and learning, g) being willing to share individual experiences and lessons with others, h) refining and updating one’s knowledge and approaches to assessing and measuring one’s own teaching, and i) last but not least, being willing to ask for help and seek resources to broaden one’s horizons as a teacher.

Through this book, we have not only highlighted the major issues that plague undergraduate chemistry education; we have also identified and outlined various aspects of high-quality student-centered teaching. Many efforts are underway to address the issues of underpreparedness and to decrease DFW rates in general chemistry. The goal here is not to increase the anxiety of dedicated instructors who are already juggling many responsibilities in addition to their teaching duties. Our goal is to help chemistry instructors find the strategies and opportunities that align with their classrooms. We understand that making sweeping changes to classroom practice can be overwhelming. Instead, small one-step changes can also show signs of progress, which can be measured in simple ways. Several teaching approaches, strategies, and assessment methods are presented in this book for readers’ consideration in making those small, measurable changes. Most instructors want to find ways to engage students with the material in their courses in order to promote deep learning and student retention. As demonstrated in the chapter summaries, metacognitive approaches to teaching chemistry may help students. These approaches focus on how well students know the content and their own learning habits, on whether their current state of knowledge is superficial or deep, and on whether students are thinking about their knowledge, actions, and efforts as they engage with the content in various settings. Some instructors may instead want to focus on improving or developing teaching activities that yield the similar outcome of engaging students in active learning.
Every effort an instructor makes to reach out to students counts toward breaking the status quo of academic apathy and underperformance. These efforts, whether small or big, matter because they can become a turning point for students who might otherwise continue to struggle. Caring instructors will act to address the learning challenges their students face. We hope that the content of this book will be useful to readers and can inspire chemistry instructors elsewhere to become "pedagogy activists" who continue to enhance the quality of STEM education.

References

1. Federal Science, Technology, Engineering, and Mathematics (STEM) Education 5-Year Strategic Plan: A Report from the Committee on STEM Education, National Science and Technology Council; National Science and Technology Council: Washington, DC, 2013.
2. President's Council of Advisors on Science and Technology (PCAST). Report to the President, Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics; 2012.
3. Association of American Universities. Progress Toward Achieving Systematic Change: A Five Year Status Report on the AAU Undergraduate STEM Education Initiative; Washington, DC, 2017.
4. Ulriksen, L.; Madsen, L. M.; Holmegaard, H. T. What Do We Know about Explanations for Drop Out/Opt Out among Young People from STM Higher Education Programmes? Studies in Sci. Educ. 2010, 46, 209–244.
5. National Research Council (NRC). Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering; The National Academies Press: Washington, DC, 2012.
6. Kaptan, K.; Timurlenk, O. Challenges for Science Education. Procedia: Social and Behavioral Sciences 2012, 51, 763–771.


7. National Science Teaching Association. Professional Learning: Quality Science Education and NSTA. https://www.nsta.org/sciencematters/qualityscience.aspx (accessed August 30, 2019).
8. Sellers, S.; Roberts, J.; Giovanetto, L.; Friedrich, K.; Hammargren, C. Reaching All Students, 2nd ed.; The Board of Regents of the University of Wisconsin System, 2007.
9. Henderson, C.; Beach, A.; Finkelstein, N. Facilitating Change in Undergraduate STEM Instructional Practices: An Analytic Review of the Literature. J. Res. Sci. Teach. 2011, 48, 952–984.
10. Silverthorn, D. U.; Thorn, P. M.; Svinicki, M. D. It's Difficult to Change the Way We Teach: Lessons from the Integrative Themes in Physiology Curriculum Module Project. Adv. Physiol. Educ. 2006, 30, 204–214.
11. Nuckles, C. R. Student-Centered Teaching: Making It Work. Adult Learning 2000, 11, 5–6.
12. Ellias, J.; Merriam, S. B. Philosophical Foundations of Adult Education; R. E. Krieger Publishing Company: Malabar, FL, 1980.
13. Lujan, H. L.; DiCarlo, S. E. Too Much Teaching, Not Enough Learning: What Is the Solution? Adv. Physiol. Educ. 2006, 30, 17–22.
14. Lamba, R. S. Inquiry-Based Student-Centered Instruction. In Chemistry Education: Best Practices, Opportunities and Trends; Garcia-Martinez, J., Serrano-Torregrosa, E., Eds.; Wiley-VCH: Germany, 2015; Ch. 12, pp 301−318.
15. Cooper, K. M.; Downing, V. R.; Brownell, S. E. The Influence of Active Learning Practices on Student Anxiety in Large-Enrollment College Science Classrooms. Int. J. STEM Educ. 2018, 5, 23. https://doi.org/10.1186/s40594-018-0123-6.
16. Beistel, D. W. A Piagetian Approach to General Chemistry. J. Chem. Educ. 1975, 52, 151–152.
17. Abraham, M. R. The Learning Cycle Approach as a Strategy for Instruction in Science. In International Handbook of Science Education; Tobin, K., Fraser, B., Eds.; Kluwer: The Netherlands, 1998; pp 513−524.
18. Wei, M.; Russell, D. W.; Zakalik, R. A. Adult Attachment, Social Self-Efficacy, Self-Disclosure, Loneliness, and Subsequent Depression for Freshman College Students: A Longitudinal Study. J. Counseling Psychol. 2005, 52, 602–614.
19. Wilson, D.; Conyers, M. Guiding Students to Be Independent Learners, 2018. https://www.edutopia.org/article/guiding-students-be-independent-learners (accessed August 30, 2019).
20. Wacker, C.; Olson, L. Teacher Mindsets: How Educators' Perspectives Shape Student Success; FutureEd, Georgetown University: June 2019.
21. Brookes, R.; Goldstein, S. The Mindset of Teachers Capable of Fostering Resilience in Students. Can. J. School Psychol. 2008, 23, 114–126.
22. Chen, X.; Soldner, M. College Students' Paths Into and Out of STEM Fields; NCES Report; U.S. Department of Education: Washington, DC, 2014.
23. Barr, D. A.; Gonzalez, M.; Wanat, S. F. The Leaky Pipeline: Factors Associated with Early Decline in Interest in Pre-Medical Studies among Under-Represented Minority Undergraduate Students. Acad. Medicine 2008, 75, 743–747.
24. Barr, D. A.; Matsui, J.; Wanat, S. F.; Gonzalez, M. E. Chemistry Courses as the Turning Point for Premedical Students. Adv. Health Sci. Educ. 2010, 15, 45–54.


25. National Research Council (NRC), Committee on the Foundations of Assessment. Knowing What Students Know: The Science and Design of Educational Assessment; Pellegrino, J., Chudowsky, N., Glaser, R., Eds.; The National Academies Press: Washington, DC, 2001.


Chapter 2

Strategies, Techniques, and Impact of Transitional Preparatory Courses for At-Risk Students in General Chemistry

Brian H. Augustine,*,1 Heather B. Miller,*,1 M. Todd Knippenberg,1 and Rachel G. Augustine2

1Department of Chemistry, One University Drive, High Point University, High Point, North Carolina 27268, United States
2Department of Statistics and Operations Research, 318 Hanes Hall, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina 27514-3260, United States
*E-mails: [email protected]; [email protected].

This chapter discusses a bridging course called "Problem Solving in Chemistry" developed in 2017 at High Point University (HPU) for students identified as at-risk through a multivariate analysis of data from the Office of Admissions at HPU, benchmarked against a historical control group. The course was designed using deliberate practice, growth mindset, and other metacognitive strategies to improve student performance on the algebraic and conceptual understandings that are foundational to success in general chemistry. In addition, the course aimed to enhance academic self-concept and the affective domain of learning. In this chapter, we review the literature on some of the most promising transitional course approaches and discuss why solving the gateway chemistry problem for students at less selective institutions, which serve the vast majority of undergraduate students, may be one of the grand challenges facing undergraduate chemistry educators.

© 2019 American Chemical Society

Introduction

General chemistry has often been described as a critical gateway course enabling students who succeed to progress into more advanced science, technology, engineering and mathematics (STEM) degree programs (1–5), as general chemistry is a required cognate course for a broad range of undergraduate STEM majors. A large cohort of students required to complete at least one, and often several, university general chemistry courses are students pursuing degrees in medical and allied health fields. Certain programs such as premedicine, predental, preveterinary, and prephysician's assistant studies often require up to five chemistry courses, including one year of general chemistry, one year of organic chemistry, and one semester of biochemistry. Persistence in these degree programs is predicated on achieving a minimal passing grade (typically a C or higher). Several studies have noted that performance in the first semester of general chemistry is broadly predictive of continuing success in subsequent chemistry courses, since the topics covered in those courses build upon the hierarchical foundation established in the first-semester general chemistry course (6–8). Typically, this first-semester chemistry course has large enrollments with a wide range of student background and preparation in science and mathematics upon graduation from high school. In a typical first-semester general chemistry course, it is not unusual to have students who are at least three years removed from a basic high school chemistry class enrolled alongside students who have taken Advanced Placement (AP) or International Baccalaureate (IB) chemistry, college dual-enrollment, and other college-level STEM courses. This creates a significant preparation gap in the background and experiences of students enrolled in the same class. Another issue, which is much harder to quantify, is the quality of secondary school science and math instruction, which varies widely across school districts and results in a highly heterogeneous distribution of students in a typical university general chemistry classroom. General chemistry can also be considered a gateway course because chemistry educators have reported, in the literature and anecdotally, a stubbornly high DFW rate (defined as the percentage of students who receive a D, F, or withdraw from the class) in introductory chemistry courses, meaning that students are not able to persist in their desired majors. Reports of DFW percentages in excess of 40% (4) within a university are not unusual despite a concerted effort by large numbers of institutions to lower this rate through best pedagogical practices grounded in discipline-based education research (DBER) (9). Losing 40% or more of the students enrolled in general chemistry represents a tremendous waste of intellectual capital and institutional resources, and poses a severe retention risk for universities.
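The DFW rate defined above is simple to compute from a course roster. A minimal sketch in Python, using hypothetical grades rather than data from any study cited here:

```python
def dfw_rate(grades):
    """Fraction of a roster receiving a D, an F, or a W (withdrawal)."""
    if not grades:
        raise ValueError("empty roster")
    return sum(1 for g in grades if g in {"D", "F", "W"}) / len(grades)

# Hypothetical ten-student section with a 40% DFW rate, the level the
# chapter describes as "not unusual" (4).
roster = ["A", "B", "C", "D", "F", "W", "B", "C", "A", "W"]
print(f"DFW rate: {dfw_rate(roster):.0%}")  # → DFW rate: 40%
```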
This problem is particularly acute and has been widely reported for the persistence of underrepresented minorities (URM) in STEM (10–13). Perhaps most troubling, these trends exist across the undergraduate educational spectrum from the most to least selective institutions, private vs. public, and doctoral vs. primarily undergraduate. It seems that no institution is immune to this vexing problem, and solving the gateway course problem for a given major may, in fact, be one of the grand challenges facing undergraduate STEM educators. In this chapter, we will report on efforts made at High Point University (HPU) during the past three years to identify at-risk students using multivariate analytics, and to create a separate transitional bridging chemistry course employing best practices in pedagogy (9), problem-solving strategies using deliberate practice (14, 15), and metacognitive strategies (16, 17). The first cohort of students who were enrolled in the bridging course in the 2017-2018 academic year completed the traditional general chemistry course in December of 2018 with no measurable improvement in student performance in terms of final course GPA or performance on the first-term American Chemical Society (ACS) general chemistry exam. In light of the lack of statistically significant improvements in student performance, we will review the overall literature of transitional preparatory chemistry courses and discuss the major findings of other institutions, as well as our own. We will also attempt to identify techniques and interventions that appear most promising and those that are less successful according to the reported literature and our experience. Finally, we posit that interventions which seem particularly promising at highly selective institutions may not be readily transferable to the vast majority of institutions due to significant differences in their respective undergraduate student populations.
Further research needs to be performed to determine if the reported successful strategies are more universally applicable to all chemistry instruction, or are limited in scope to the more homogeneous student population present at highly selective institutions.


The Murkiness of College Admissions Exam Data

It has been widely reported since as early as the 1920s (18) that it is possible to successfully identify at-risk students prior to enrollment in introductory chemistry courses. In the past forty years, means of identifying at-risk students have typically fallen into four major categories: performance on standardized college admissions exams such as the SAT and ACT, especially the math component of these exams (1, 3, 19–24); a chemistry and/or math placement exam (4, 25); the first exam in general chemistry (26); or a multivariate approach (27) involving a combination of some of the aforementioned metrics. While there is no universal agreement about the best way to identify students as being at-risk, it has been repeatedly shown that there is a medium to strong correlation between performance on the math portion of the SAT or ACT exam (subsequently referred to as SATM) and the final grade in general chemistry, with reported variances of R2 = 0.26 to 0.61 across multiple studies (3, 23, 24, 28–30). Because the SAT/ACT is widely taken for college admissions, students take these exams seriously, and they do not pose the logistical challenge of scheduling additional placement exams during the summer or at the beginning of the term; for these reasons, many institutions have used SATM as a primary tool for identifying at-risk students when designing intervention strategies. Meta-analysis of the cognition literature has shown a correlation between persistence in college and cognitive ability using both SAT and high school GPA (31). While it has not been adopted universally, several recent reports have identified students as at-risk and devised interventions based on students falling in the bottom quartile to bottom third of a given cohort at institutions ranging from highly selective to less selective (2, 24, 32, 33).
This is a point that is somewhat counterintuitive, as it could be reasoned that if a student is able to pass the rigorous admission standards of a highly selective institution (here defined as less than a 25% acceptance rate), then that student's math ability is surely sufficient to succeed in the algebra required in general chemistry. Thus, highly selective institutions should not face the same attrition rates in general chemistry as less selective schools if math ability is the primary indicator of success.

The Big Fish-Little Pond Effect (BFLPE) and Individual Relative Deprivation (IRD) Effect in Gateway Chemistry Courses

The literature has shown that dividing cohorts by SATM is not an entirely arbitrary decision. An intriguing study in the 1990s reported that students initially enrolled in a STEM degree program in the bottom third of any given class had only a 15% chance of graduating with a STEM degree, while those in the top third had over a 50% chance of successfully persisting to a STEM degree (34). What makes this study particularly interesting is that the authors showed SATM data for eleven different unnamed institutions, later identified to range from the most selective, Harvard University, to the minimally selective Hartwick College (NY) (35), and the STEM graduation rates were essentially identical for all eleven institutions regardless of admissions selectivity. In fact, the average SATM score of students in the bottom third at Harvard University (SATM = 581) was higher than that of the top third at Hartwick College (SATM = 569), yet the STEM graduation rates were 40% lower. (It should be noted that SAT scores have been recalibrated higher several times since this was originally published and cannot be directly compared with SATM scores from more recent papers or in this chapter.) A possible explanation of this observation has been termed the "Big Fish-Little Pond Effect" (BFLPE), a widely recognized construct in educational psychology for explaining the development of a student's academic self-concept (ASC) and frame-of-reference effects (36–39). The BFLPE has been implicated as an important means of understanding why students in the bottom range of a particular cohort tend to feel inferior to their peers regardless of their absolute standing on national norms or percentiles (38). While not universally accepted, the BFLPE has been reported in cross-cultural settings around the world (40). The result is that in classes, on exams, and in discussions, these students feel as if they are the only ones in the class who do not understand a challenging topic, and that others seem to grasp it much more easily than they do. Further, it does not matter how selective the institution: there will always be a bottom cohort of students, and thus there will always be students who experience this effect. Related to the BFLPE is a concept from the sociology literature with a long history of explanatory power, the Individual Relative Deprivation (IRD) effect, which fits into a broad category of theories used to explain relative social and psychological comparisons made by individuals (41). If a person feels deprived relative to peers, the result is resentment and anger. While the magnitudes of the BFLPE and IRD effects are not entirely settled in the psychology and sociology literature, if these effects are real, it is not without merit, regardless of the institution, to identify at-risk students as those in the bottom cohort and to devise strategies to improve students' academic self-concept. Whether the cut-off is the bottom quartile or the bottom third is not the important factor; what matters is identifying students who are likely to struggle relative to their peers and, most importantly, providing successful interventions to overcome this reality.
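The frame-of-reference point behind the BFLPE can be made concrete: the same score occupies very different positions in different cohorts. A small sketch with hypothetical SATM distributions, loosely echoing the Harvard/Hartwick comparison but not the actual data from Ref. (34):

```python
def standing_in_cohort(score, cohort):
    """Fraction of the cohort scoring at or below `score` (0.0 to 1.0)."""
    return sum(1 for s in cohort if s <= score) / len(cohort)

# Hypothetical SATM distributions at a highly selective and a less
# selective institution; values are illustrative only.
selective      = [560, 580, 600, 620, 640, 660, 680, 700, 720, 740]
less_selective = [420, 440, 460, 480, 500, 520, 540, 560, 580, 600]

score = 580
print(standing_in_cohort(score, selective))       # near the bottom of this cohort
print(standing_in_cohort(score, less_selective))  # near the top of this one
```

The identical score lands near the bottom of one cohort and near the top of the other, which is exactly the comparison a student implicitly makes under the BFLPE.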
Addressing the Issue of College Preparation: A Brief Literature Review on Preparatory General Chemistry Courses

In order to improve the outcomes of potentially at-risk general chemistry students, an extensive range of evidence-based interventions has been reported in the literature, and reviewing the entire breadth of strategies is beyond the scope of this chapter. The reader is referred to a recent, detailed comprehensive review showing the breadth of approaches chemistry educators have taken (9). The focus of this chapter is specifically on the types of preparatory courses that have been reported. Within the category of preparatory general chemistry courses, a range of designs has been reported, but two major types of preparatory courses have been devised for students identified as being at-risk: (1.) courses that run concurrently with the first-semester general chemistry course, which we will refer to, using the terminology adopted by Lee, as "service courses" (5); and (2.) courses that run as a separate course prior to admission into the first semester of general chemistry, which we will call "bridging courses," again consistent with Lee (5). It should be noted that there have been a few reports of summer bridging courses, but these have not been a significant focus in the literature (42–44) and will not be reviewed here. Table 1 identifies the various preparatory chemistry courses reported in the literature, what criteria were used for defining at-risk, which intervention type was adopted, the claimed outcomes of those studies, and, where available, the conclusion of a recent meta-analysis of these studies as to whether the results were statistically significant (SS) or not statistically significant (NSS) (5).


Table 1. Literature summary of preparatory courses by type

| Program Type | Institution | Year | Ref. | Target Population | Intervention | Outcome | Meta-Analysis (5) |
| Service | Columbia | 1975 | 19 | SATM < 610 | a, g | GPA in GC-I | -- |
| Service | Florida State | 1995 | 72 | All students | b | Retention; GPA in GC-I | NSS |
| Service | Michigan State | 1996 | 7 | URM; Underprepared | c | Grade Distribution in GC-I | NSS |
| Service | Grinnell and UC-Berkeley | 2001 | 71 | All Students | d | Exam scores; Interviews | NSS |
| Service | UT-Austin | 2001 | 70 | SATM < 462 | b | Grade Distribution in GC-I | -- |
| Service | NC State | 2004 | 54 | All students | a, b, e | Exam Scores | NSS |
| Service | U Ill.–Urbana-Champaign | 2007 | 12 | URM; Underprepared | a, b, c | GPA in GC-I | -- |
| Service | South Florida | 2008 | 3 | All Students | b | In-Class exam; ACS exam | SS |
| Service | Washington–St. Louis | 2008 | 50 | All Students | b | GPA in GC-I | SS |
| Service | Washington–St. Louis | 2012 | 27 | Suggested for At Risk | b | GPA in GC-I | SS |
| Service | South Florida | 2016 | 68 | SATM < 515 | c | Final Exam Scores | -- |
| Service | Cornell | 2018 | 5 | All Students | b, e | GPA in GC-I and GC-II | SS |
| Service | Johns Hopkins | 2018 | 32 | Suggested for At Risk | e | GPA in GC-I and GC-II | -- |
| Bridging | Ohio State University | 1974 | 80 | Underprepared | -- | Student Survey | -- |
| Bridging | Western Kentucky | 1976 | 8 | ACS Toledo Exam | f | GPA in GC-I | -- |
| Bridging | Toledo | 1977 | 78 | ACS Toledo | g | GPA in GC-I and GC-II | -- |
| Bridging | Oakland University | 1983 | 66 | Placement Exam | a, f | Not Reported | -- |
| Bridging | U Illinois–Chicago | 2000 | 67 | Voluntary | a, f | GPA in GC-I and GC-II | -- |
| Bridging | Texas Tech | 2005 | 4 | Placement Exam | a | GPA in GC-I | SS |
| Bridging | UMass–Amherst | 2007 | 103 | Voluntary | h | GPA in GC-I | -- |
| Bridging | Hunter College | 2009 | 27 | First Exam Grade | a, f | None related to GC-I | -- |
| Bridging | UW-Milwaukee | 2012 | 79 | Math Placement | e; no lab | Cognitive Outcomes | -- |
| Bridging | Duke | 2014 | 2 | SAT Quartile | b, c | Retention | SS |
| Bridging | Benedictine | 2018 | 33 | ACTM < 23 | a, b | GPA in GC-I | -- |

a Emphasis on problem-solving techniques / harder problems / chemical vocabulary; b Cooperative learning / peer-led group instruction / peer-led teaching; c Soft study skills such as emphasis on attendance, study techniques; d Modular contextual examples; e Active learning / flipped classroom / combined lecture-lab; f Fewer topics / math emphasis; g Pass/Fail course; h Self-led study. SS (statistically significant), NSS (not statistically significant) according to Ref. (5).

Service courses are logistically the easiest in terms of student scheduling and are the optimal solution if successful, since a service course is taken during the normal sequencing of a student's chemistry progression. Bridging courses place students out of the normal course sequence, creating logistical challenges for all of the subsequent chemistry courses and potentially other prerequisite courses. These courses also require much more buy-in from other stakeholders on campus, since waiting a semester or two directly impacts the degree progress of many other majors. For example, in order to have the bridging course at HPU approved by the university curriculum committee, multiple department chairs, the deans of three separate colleges, admissions, student life, freshman advising, a retention committee, and the provost all needed to approve the proposal. One potential unintended consequence of either intervention is that labeling a student as at-risk may cause them to self-identify as "not ready" or "less able," which can result in stereotype threat (45–47). This is most pronounced in separate bridging courses, because it is most obvious which students are not taking the normal course progression, and it is of particular concern for URM students. Careful messaging about the purpose of the course is essential. Once a transitional course is devised, how students are placed into these courses can vary significantly. As mentioned above, some version of the SAT, high school GPA, a placement test, or a multivariate approach has been reported as a means of identifying at-risk students. Once students are identified, preparatory courses are either mandatory or optional (self-selected), the latter presumably to avoid stereotype threat. A wide range of anecdotal and evidence-based interventions have been utilized in designing transitional courses, complicating direct comparisons between approaches.
A non-exhaustive list includes peer-led team learning (PLTL) (27, 48–50), process-oriented guided-inquiry learning (POGIL) (51–53), student-centered activities for large enrollment undergraduate programs (SCALE-UP) (54–56), studio classes (32, 57–60), supplemental instruction (SI) (61, 62), team-based learning (TBL) (63), flipped classrooms (64, 65), problem solving (4, 12, 19, 26, 33, 54, 66, 67), study habits (68, 69), class size (70), modular approaches (71), cooperative learning (72), deliberate practice (15), metacognitive approaches such as growth mindset (16) and grit (17), and Treisman-style workshops modeled after a highly successful intensive mathematics service course (10, 11, 73, 74). Finally, when evaluating the literature on preparatory courses, it should be noted that many of the studies are severely underpowered from a statistical standpoint because the sample cohorts are relatively small (75). Because of this, the reported improvements in student performance should be considered preliminary until larger sample sizes are reported and, ideally, the results have been replicated at other institutions.

Types of Preparatory Courses and Teaching Strategies

As noted in Table 1, there are at least a dozen service courses reported in the literature dating back to the mid-1970s. Across these reports, there are no clear trends in terms of type of intervention, outcomes, mechanics of the course, or even how students are identified to participate in the program. About half of the reported service courses have specifically identified at-risk students using one of the metrics described earlier, although the described service courses are generally optional. Often, students participating in the service course are asked to sign a contract demonstrating their commitment to the program concerning attendance and participation in the service portion of the general chemistry course.
A common contract term is that students will not miss more than two sessions of the service course or they will be removed from the course. A range of interventions is reported, but the two most common types are a variation of peer-led teaching or Treisman-type workshops (5, 12), which will be explained below. According to Lee's meta-analysis, a PLTL model at Washington University in St. Louis (27, 50) and a POGIL model at the University of South Florida (76) have shown statistical significance, while the others were not determined to be statistically significant or were not included in the meta-analysis (5). The report from USF is particularly interesting because the authors report not only on the effectiveness of their intervention, meaning improvement on assessments including the ACS examination, but also on equity, for which they tried to determine whether their intervention unequally benefitted better-prepared students compared to students identified as at-risk or URM students (76). The authors concluded that while student performance improved, "achieving equity remained elusive: the consequences of pre-existing achievement gaps did not lessen as a result of reform implementation." It should be noted that more service courses than bridging courses have been reported in the literature in the past decade, and it is not clear whether there is now a trend toward offering service rather than bridging courses for at-risk students. There have been at least ten bridging courses reported in the literature, also dating back to the 1970s. These courses are offered prior to enrollment in the standard first-semester general chemistry course. By definition, these are more disruptive to a student's course of study, since they involve an additional semester prior to starting the chemistry sequence, and they typically do not count toward any degree program. Some of these courses have been designed to count toward a general education science requirement, and the majority do not have a laboratory component. Both service and bridging courses are designed to improve the outcomes of at-risk students. However, bridging courses have at-risk students as their primary audience.
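Judgments of statistical significance like those above hinge on cohort size. A back-of-the-envelope power calculation (a sketch using the standard normal approximation for a two-sample comparison of means, not the method used in any cited study) shows why small preparatory-course cohorts are easily underpowered:

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sample comparison of
    means (Cohen's d effect size), using the normal approximation."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # two-sided test at significance alpha
    z_beta = z(power)            # desired power
    return math.ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

# Detecting a medium (d = 0.5) vs. a small (d = 0.2) effect at 80% power:
print(n_per_group(0.5))  # → 63 students per group
print(n_per_group(0.2))  # → 393 students per group
```

A single section of 20-30 students per arm therefore cannot reliably detect even a medium-sized effect, consistent with the caution above that many reported improvements should be treated as preliminary.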
Placement exams (4, 25, 66, 77–79), SATM (2, 33), self-identification (80), the grade on the first exam in general chemistry (26), and self-paced online modules (81) have all been employed to identify students for enrollment in bridging courses. Two bridging courses were identified by Lee's meta-analysis as showing statistical significance (5). The first was reported in 2005 by researchers at Texas Tech University (4, 25). They developed a 20-question multiple-choice placement exam consisting of general science, chemistry, and basic mathematics content modeled after the ACS Toledo Examination. All students planning to enroll in their traditional general chemistry course (Chem 1307) were required to take the placement exam during the summer prior to enrollment. Students who scored above 50% were permitted to enroll directly in Chem 1307. All other students were required to take the bridging course (Chem 1301). Fully 75% of incoming freshmen were required to enroll in Chem 1301. In their six-year evaluation, they found that the rate of progress through Chem 1307 was actually lower after introducing the placement exam, because between 50 and 60% of the students stopped after Chem 1301 and never progressed to Chem 1307. However, according to Lee, the students who did persist to Chem 1307 improved their academic performance (5). The second statistically significant study is by researchers at Duke University and is called the "Science Advancement through Group Engagement" (SAGE) program (2). SAGE is an overarching model using PLTL designed to work with at-risk first- and second-year students through four semesters of chemistry courses (two general chemistry and two organic chemistry). The program offers both a bridging and a service component depending on SATM placement.
While the paper is not specifically about the design of their bridging course, for entering freshmen in the bottom quartile in SATM, a bridging course was developed using the overarching educational philosophy of social constructivism. The authors report that the SAGE program has more than doubled the retention of students through the general and organic chemistry sequence compared to their historical and contemporary control groups (2). A particularly promising approach adopted by several transitional chemistry courses was originally developed to improve student performance in introductory calculus classes.

This was reported by Uri Treisman and colleagues in a project started in the 1970s called the Emerging Scholars Program (ESP) (10, 11, 73, 74). The idea was originally developed in order to address what Hsu termed “a practical crisis at the University of California at Berkeley of overwhelming failure at the calculus level of black and Hispanic students (10).” This model was adopted and further refined over the years at the University of Texas at Austin (82), the University of Wisconsin at Madison (83, 84) and the University of Kentucky (85) and has spread to other locations over the subsequent decades (86). First semester calculus was identified as a gateway course for the same reasons described for general chemistry because it presents a significant barrier for students pursuing STEM degrees as well as students interested in the medical professions. In the original reports, it was observed that Calculus I had a high failure rate for “high-achieving, high potential” minority students (73). As the program has evolved, it has been reported that URM students significantly benefit from the ESP workshops, but achievement gains are observed for all students who participate (10). The overarching philosophy of ESP workshops is distinguishing between excellence and nonfailure (87). The authors point out that many preparatory programs have a primary goal of preventing students from failing. In the ESP model, the goal is the development of excellence in mathematical reasoning. The way that they accomplish this is through a workshop model generally taught by graduate teaching assistants (GTA). Historically, Calculus I had been taught at research universities as a three hour lecture per week with a one hour per week recitation taught by a GTA in which the GTA worked problems at the board. Treisman decided to replace the recitation session with longer problem solving sessions he called “workshops”. 
Compared to traditional recitation sections, the workshops typically have fewer students (12 - 20 instead of 25 - 30), meet for longer blocks of time (75 - 120 minutes instead of 50 minutes), and meet more often (2 - 3 times per week instead of 1 - 2) (10). During the workshops, students work in groups on carefully designed worksheets that are purposely challenging, drawing on concepts and material from several chapters of the book, forcing students to work collaboratively, and teaching them to persevere. Students are not working on homework, and the GTAs are specifically directed not to answer questions but only to keep discussions moving forward. Because of the promising results reported by Treisman over several decades, there have been several reports in the chemistry education literature about Treisman-type workshops designed for general chemistry instruction (5, 12, 88, 89). While not directly related to Treisman’s approach for calculus instruction, there is an emerging field of psychology research on the topic of expert performance, pioneered by the work of Anders Ericsson at Florida State University, which has notable similarities to Treisman’s workshops. Ericsson argues that the primary means of developing expert performance in a given field is through what he has termed “deliberate practice,” which he first observed by studying violin students at the Berlin University of the Arts (14) and has since observed in many other seemingly unrelated fields (15). The major result of the initial study was that students who reached the highest levels of performance in violin had practiced for several thousand hours over nearly two decades, and that truly effective practice had certain defining characteristics (14). The characteristics of deliberate practice are: (1) it requires a field in which training techniques are already well-developed; (2) it requires an expert teacher or coach who can provide activities to improve performance and provide immediate feedback to the learner; (3) it takes place outside of one’s comfort zone and requires students to push just beyond their current abilities (related to Vygotsky’s Zone of Proximal Development (ZPD) (90)); (4) it begins with specific, well-defined goals that have been set to improve performance in a specific task; (5) it requires a learner’s full attention and concentration; (6) it requires feedback and modification of efforts in response to the feedback; (7) it produces effective mental representations; and (8) it nearly always involves building or modifying previously developed skills. Ericsson argues that training programs that have these overarching characteristics are needed to develop continually improving performance in any field (15). He acknowledges that many fields at this point cannot truly implement deliberate practice as described because those fields do not have well-established protocols for producing expertise. Specifically, there is no consensus on the optimal method of teaching general chemistry as there is for disciplines such as ballet dancing and playing the violin (15), but using Ericsson’s principles, science educators have recently begun exploring whether deliberate practice ideas can also be used to improve academic performance in STEM fields (91, 92). In fact, Ericsson’s own work suggests that simply “studying longer” or “studying harder” does not produce improved academic performance (93); it is only when study habits involve principles of deliberate practice that there is a correlation between studying and improved GPA. Evaluating Treisman’s ESP workshops through the lens of Ericsson’s deliberate practice, it is apparent that both models have considerable conceptual overlap even though they use different terminology. Using deliberate practice as the overarching theoretical framework, it is clear that the workshop model employs many of Ericsson’s essential characteristics for improving performance: an expert providing appropriate training and immediate feedback (challenging worksheets with trained GTAs monitoring progress), focused and concentrated effort during the planned workshop times, pushing students out of their comfort zones through problems challenging enough to teach them to develop new mental representations and to persevere, goal setting, and building on previously developed skills. All of these key requirements are evident in the challenging, multichapter problems required by the ESP approach.
In the remainder of this chapter we will discuss how we have used deliberate practice as the theoretical model of a bridging chemistry course developed at HPU, but the reader will note that many of the key ESP workshop ideas developed for calculus instruction are present as well.

High Point University’s General Chemistry Background

University and Student Profile

High Point University (HPU) is a small, private, liberal arts university located in the Piedmont Triad region of North Carolina with an undergraduate population of 4500 students. Students enrolled in CHM 1010: General Chemistry I (henceforth called GC-I) fit the profile identified in the introduction of being primarily interested in some type of pre-professional medical or allied health field. The most common majors among general chemistry students at HPU are biology, neuroscience, exercise science, and pre-pharmacy. It should be noted that there are almost no chemistry or biochemistry majors enrolled in GC-I because there is a separate section of general chemistry for majors. Other degrees which might be expected to appear in general chemistry, such as physics, engineering, geology, nursing, and dietetics, either do not exist at HPU or do not require general chemistry for their degree programs. The 25th - 75th percentile range for the composite SAT (verbal + math) for enrolling students at HPU in 2017 was 1070 - 1260, which is the 48th - 81st percentile nationally. While there is no formal definition of selectivity for college admissions, acceptance rate is often used as a proxy for selectivity. Adopting the definitions used by Jon Boeckenstedt’s blog (94), selectivity can be divided into five categories: most selective (0 - 12.5% acceptance rate), highly selective (12.5 - 25%), very selective (25 - 50%), somewhat selective (50 - 75%), and not selective (75 - 100%). According to these definitions, HPU would fall in the lower end of the most common category, somewhat selective. In 2012, Somewhat Selective and Not Selective schools enrolled nearly 78% of the undergraduates in the U.S., compared to only 4.4% of the nation’s undergraduates being enrolled in the Most Selective and Highly Selective categories (i.e., the top quartile). The complete data set can be found in the U.S. Department of Education’s Integrated Postsecondary Education Data System (IPEDS) database for all public and private schools in the doctoral/research, masters, baccalaureate, and business/engineering Carnegie Classifications (95).

General Chemistry at HPU

The two-semester general chemistry sequence at HPU is a traditional three-credit lecture plus one-credit (120 minutes per week) laboratory sequence, and the topics discussed are standard in most textbooks and on the ACS examination for the first semester and full year. We have adopted Tro’s Chemistry: A Molecular Approach (96) and make extensive use of Pearson’s Mastering Chemistry online homework and Dynamic Study Modules in all sections of the course. Annual enrollment is approximately 250 students and has risen over the past fifteen years in proportion to the overall rise in enrollment at the university from 1400 to 4500 undergraduate students. Class sections are generally capped at an enrollment of 45 students, so there are typically 5 - 7 sections per semester taught by different full-time faculty. Performance on the ACS general chemistry exam has risen significantly over the past six years, from a baseline at the 33rd %-tile in Fall 2013 on the 2009 ACS First Semester General Chemistry examination to an average of the 65th %-tile from 2014 - 2018 on the same instrument. The ACS final exam is typically worth 25 - 30% of the final course grade; hour exams are generally worth 45 - 50%, with the remainder of the grade based on online homework, Dynamic Study Modules, and participation.
There are a variety of reasons for the performance increase on the ACS exam over the past half-decade, but the majority of the improvement can be attributed to all sections implementing more active learning in lecture, daily pre- and post-classroom reading and homework assignments, and the creation of a strongly encouraged drop-in “Learning Laboratory,” which meets for a total of 20 hours per week, along with an online virtual Learning Laboratory through Piazza (97). The Learning Laboratories and Piazza are staffed by faculty and upper-division undergraduate students. Additionally, weekly quizzes, SI, adaptive homework problems, and metacognitive messaging have been used in all sections (98). While we have not attempted, through a controlled scientific study, to identify which specific evidence-based practice has produced the increases in performance on the ACS exams, it is clear that the combination of active learning and metacognitive strategies has transformed the classroom culture and has resulted in significant gains in student performance in general chemistry at HPU. These gains have also persisted into organic and biochemistry, both of which have seen similar improvements in performance on standardized assessments such as ACS exams and the physical science sections of the Medical College Admission Test (MCAT) (unpublished data). While the increased rigor and performance has obviously benefitted a large number of students enrolled in our general chemistry sequence, it has more clearly revealed the bifurcated distribution of student performance observed by many other chemistry educators. The much-discussed at-risk students identified in the literature have become more apparent in our classes. As a result, DFW rates have increased significantly, from an average of only 9% prior to 2013 to a running average of 31.5% from 2013 - 2018.
Thus, major performance gains made possible through more focused active learning strategies have only amplified an issue that was more than likely lurking under the surface prior to Fall 2013, but was revealed as the rigor and expectations in the course increased.


Problem Solving in Chemistry Bridging Course for At-Risk Students

To assist students most likely to receive a D, F, or W, we created a one-semester, four-credit-hour bridging chemistry course called CHM 1008: Problem Solving in Chemistry (referred to subsequently as PS-C). As the name implies, helping students develop strategies for solving algorithmic problems is the primary focus of the course. This goal was to be accomplished in an active learning environment and by incorporating best practices in metacognitive strategies from the educational and expert-performance psychology literature, such as “growth mindset” from Carol Dweck (16), “grit” from Angela Duckworth (17), and the aforementioned deliberate practice (15). Other metacognitive strategies included using cognitive exam wrappers, introducing students to Bloom’s taxonomy and the study cycle in the first lecture after the first exam, and concept mapping in the lab (98). The same textbook and Mastering Chemistry were used as in GC-I, but the semester was designed to cover only the first four chapters of the book, with the major topics of dimensional analysis, structure of the atom, naming and writing compounds (especially ionic compounds), the mole concept, stoichiometry, and aqueous reactions. Based on experience, and recently confirmed in the literature (24), students who struggle with the mole concept and stoichiometry tend to do poorly in general chemistry. Focusing on how to solve dimensional analysis problems in general, and stoichiometry problems specifically, should benefit any student who continues into the regular general chemistry sequence. The grading structure consisted of five hourly exams (rather than the three or four in the regular GC-I course) worth 45% of the course grade plus a non-standardized comprehensive final exam worth 20%. The remainder of the grade was based on homework, class preparation, and the laboratory.
Importantly, Pearson’s Mastering Chemistry was implemented for online homework, adaptive follow-up assignments, Dynamic Study Modules, and in-class and pre-class participation. The course counted toward a general education science credit upon completion for those who elected not to continue on to other science courses, but did not count toward the fulfillment of any major requirements. Enrollment was limited to freshman students, with a few exceptions who were not considered in the current analysis. The Fall 2017 course was originally designed with three fifty-minute lectures and 180 minutes per week of a wet/dry lab which alternated every week. In the dry lab, students were taught specific problem-solving strategies using worksheets that included problem sets with progressively more difficult word problems on a specific topic, Dynamic Study Modules, or specific Mastering Chemistry problem sets designed as review problems prior to exams. In the wet lab, the purpose was not to develop particular laboratory techniques; rather, semi-quantitative experiments were carefully chosen to illustrate concepts such as simple measurements, the mole, molarity, making solutions, density measurements, and precipitation reactions. No emphasis was placed on lab report writing or error analysis; the labs focused strictly on reinforcing concepts taught in the lecture class through hands-on experiences. For the Spring 2018 iteration of the course, it was decided that 180-minute labs provided more time than needed to complete the wet labs, and that it was difficult for students to stay focused for three hours in the dry lab, so both the wet and the dry lab were reduced to 120 minutes weekly.
Metrics for Identifying At-Risk Students and Placement in Bridging Course

In order to identify at-risk students from a historical control group, an analysis was performed on the final grade outcomes of all students who enrolled in GC-I from Fall 2014 - Fall 2016 (N = 623) by comparing typical metrics such as SATM and high school GPA with final course GPA. In addition, a secondary analysis was made using a multivariate metric used by the HPU Office of Admissions called the “Admissions Index” (AI). AI is a proprietary weighted combination of the “superscored” standardized testing score (in other words, the highest attempted score for each section of the SAT or ACT exam), high school GPA, rigor of the student’s high school academic program, and overall rigor of the high school, as determined by the high school guidance counselor and the admissions office, respectively. The algorithm is proprietary, but such an internal metric is widely employed by many university admissions offices (34). While the exact formula is opaque, HPU’s index weights long-term academic measures such as high school GPA, class rank, and rigor of high school more heavily than standardized exam scores. Figure 1 shows a nearly Gaussian distribution of AIs for all students enrolled in GC-I from 2014 - 2016, with a mean AI of 78.0. The average AI for the entire student body at HPU during the same time period was 75.2 and was nearly identical over the three years studied. Figure 2 and Figure 3 show final course GPA in GC-I from 2014 - 2016 compared to SATM and AI, respectively. While SATM score has a moderate correlation (R2 = 0.203), as has been reported in the literature as a predictor of success in general chemistry, AI has a stronger correlation (R2 = 0.356). These R2 values are both statistically significant at p < 0.001.
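As an illustration of the correlation analysis behind Figures 2 and 3, the sketch below computes a squared Pearson correlation (R2) in Python. This is not the authors' code (their analysis was done in Excel and R), and the AI and GPA values are entirely hypothetical.

```python
# Illustrative sketch only (not the authors' code): computing the squared
# Pearson correlation (R2) between an admissions metric and final course
# GPA, as reported for Figures 2 and 3. All values below are hypothetical.
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

ai = [70, 72, 75, 78, 80, 83, 85, 88, 90, 92]             # hypothetical AI values
gpa = [1.0, 1.3, 2.0, 2.3, 2.7, 2.7, 3.0, 3.3, 3.7, 4.0]  # hypothetical GC-I GPAs

r = pearson_r(ai, gpa)
print(f"R2 = {r ** 2:.3f}")
```

R2 here is simply the square of the Pearson coefficient for a single-predictor linear regression, which matches how the R2 values in Figures 2 and 3 are reported.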

Figure 1. AI histogram of students enrolled in GC-I Fall 2014 - Fall 2016. Mean AI = 78.0, N = 623.

Another way of plotting the AI data is the percentage of students who achieve the requisite C- grade in order to progress to General Chemistry II, called the progression rate. Figure 4 shows a bubble plot of the progression rate as a function of AI. The AI data are binned every 2.50 points (e.g., AI = 80.00 - 82.49), and the size and color of the circle for each data point are proportional to the number of students within that binning range. One can clearly see two linear regions with a change in slope at approximately AI = 80.0. Students with AI > 80 have a 90 - 100% chance of progressing into General Chemistry II, but those with AI < 80 see their progression rate drop linearly. Based on Figure 3, even those who do progress typically have a much lower average GPA. It was for these at-risk students with an AI < 78 that the bridging chemistry course (PS-C) was designed. Using the data from Figures 2, 3, and 4, a placement matrix for GC-I, PS-C, Calculus I, and precalculus (Pre-Calc) was devised for use by freshman advisors, as shown in Figure 5. Students who were placed into PS-C were not able to enroll in GC-I until after completing PS-C, either during the 2018 summer or fall term. Based on this decision matrix, students enrolled in PS-C in Fall Semester 2017 were closer in academic profile to the students enrolled directly in GC-I, while students in PS-C in Spring Semester 2018 had weaker academic profiles and were considered more at-risk in terms of chemistry progression.
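The binned progression-rate calculation behind Figure 4 can be sketched as follows. This is not the authors' code, and the (AI, progressed) records are invented for illustration.

```python
# Hypothetical sketch of the binned progression-rate calculation behind
# Figure 4: AI values are binned every 2.5 points, and each bin reports the
# fraction of students earning the C- needed to progress to GC-II.
# The (AI, progressed) records below are invented for illustration.
records = [
    (71.0, False), (73.5, False), (76.0, True), (77.9, False),
    (80.1, True), (82.4, True), (85.0, True), (87.3, True),
]

def bin_key(ai, width=2.5):
    """Lower edge of the AI bin containing this value (e.g., 80.00 - 82.49)."""
    return (ai // width) * width

bins = {}
for ai, progressed in records:
    key = bin_key(ai)
    n, passed = bins.get(key, (0, 0))
    bins[key] = (n + 1, passed + progressed)

for key in sorted(bins):
    n, passed = bins[key]
    print(f"AI {key:.2f} - {key + 2.49:.2f}: {passed}/{n} progressed ({100 * passed / n:.0f}%)")
```

In the published figure, each bin's student count also sets the bubble's size and shading; here the count is simply printed alongside the progression rate.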

Figure 2. Final GC-I grade as a function of SATM, Fall 2014 - Fall 2016. Mean SATM = 573.6, R2 = 0.203, N = 530 (data do not include W grades). Linear regression and variance are shown. Vertical line is located at SATM = 590, used for advising into PS-C. Horizontal line is located between C- and D+ grades, showing progression to General Chemistry II.

Figure 3. Final GC-I grade as a function of AI, Fall 2014 - Fall 2016. (R2 = 0.356, N=623). Linear regression and variance are shown. Vertical line is located at AI = 78 used for advising into PS-C. Horizontal line is located between C- and D+ grade showing progression to General Chemistry II. W grades have been arbitrarily assigned GPA = 0.1 and shown as open circles.


Figure 4. GC-I progression rate from Fall 2014 - Fall 2016. AI is binned every 2.5. The size of each circle and color shading is proportional to the number of students in that binning range. (N = 623).

Figure 5. General chemistry placement decision matrix for freshman advisors.

Teaching Strategies Used in Bridging Course and Approaches to Measuring the Impact of Those Strategies

In addition to assessing the course performance of our students (the cognitive domain of learning), we measured a number of variables belonging to the affective domain of learning. HPU’s current Quality Enhancement Plan (QEP) centers on Growth Mindset. This theory was pioneered by Carol Dweck and describes a person’s mindset as existing on a continuum between a fixed mindset and a growth mindset (16). If you have a fixed mindset, you believe your abilities and intelligence are predetermined; if you have a growth mindset, you believe they can be developed. This measure was particularly well-suited for PS-C, as we were concerned that students identified as at-risk may have shifted toward a fixed mindset while they still had the opportunity to strengthen their chemistry and mathematical knowledge throughout the curriculum. Instructors imparted consistent verbal messaging that improving performance is possible only if you first reject the idea of a fixed mindset. As these students were all in their first year, we emphasized that studying and practicing in the same manner that worked at the high school level would most likely not allow them to reach their long-term goals. Additionally, just practicing a technique for a certain number of hours will not guarantee success (93); rather, deliberate practice is what matters. Before receiving a graded in-class exam, each student completed a reflective exam wrapper that asked them to report how they approached studying for that exam, as well as the grade they expected to earn. Once the instructor provided feedback and their grade, students again reflected on which concepts gave them the most difficulty. Importantly, they were asked to generate specific adjustments they would make to improve their performance on the next exam. We aimed to enhance growth mindsets in our new course to help promote higher levels of achievement in chemistry. Students responded both at the beginning and end of the semester to a validated growth mindset instrument (99). Additionally, we spoke with students in PS-C about the importance of grit, a personality trait that involves perseverance even when things are difficult (17). Some of our students were entering college and being advised not to enroll in a traditional first-semester chemistry course. We wanted to impart messaging that included encouragement that they could work hard in order to be ultimately successful in chemistry and reach their long-term goals.
At the same time, we felt it was important to emphasize that the practice this would entail might be difficult or feel “uncomfortable” at times. Students responded both at the beginning and the end of the semester to Duckworth’s Short Grit Scale, which consists of eight statements on a 1 - 5 Likert scale (100). As described in the introduction, deliberate practice was infused throughout the lecture and laboratory sections of the PS-C course in a number of ways. In the dry laboratory, instructors led a discussion on select readings from Ericsson’s book, Peak: Secrets from the New Science of Expertise (15). The goal of this discussion was to provide concrete examples of novices improving over time at a particular skill. Importantly, we discussed the practice strategies that went into these achievements. We also asked students to share examples from their own lives. One of the hallmarks of deliberate practice is repetition. In line with this, students in the PS-C course met four days per week between the lecture and laboratory times. Mastering Chemistry assignments were due at the start of every lecture, including both reading quizzes on the upcoming material and more in-depth problem sets on material that had already been discussed in class. The problem sets also included optional Adaptive Follow-Up assignments due 2 - 3 days later; these were tailored to each student’s current misunderstandings of the material. Students were exempt from these if they scored 80% or higher on the associated online homework assignment. Additionally, repetition was achieved through the Dynamic Study Modules employed during and after the dry labs. These serve as virtual flashcards that are adaptive to student responses, but also repetitive. This level of practice and feedback was designed to develop at-risk students’ mental representations in chemistry.
Finally, we aimed to measure students’ self-concepts, which can be described as the degree to which students perceive themselves to be strong in a particular area. Given the objectives of this study, we were most interested in students’ self-concept in chemistry and mathematics before and after completing the PS-C course. We also surveyed students in the traditional GC-I course using the validated Chemistry Self-Concept Inventory (101). The university’s Institutional Review Board (IRB) approved this protocol (#201707-617) under an exempt review.


Research Questions

1. Does a bridging chemistry course that includes deliberate practice and metacognitive strategies impact the academic performance of at-risk students in a subsequent general chemistry course?
2. Does a bridging chemistry course designed with deliberate practice and metacognitive strategies impact the mindsets and self-concepts of students enrolled in introductory chemistry?

Data Collection and Analysis

Data was collected and analyzed for both research questions. For Research Question #1, historical control data was obtained through a records query by the Office of Institutional Research (OIR) at HPU. The academic and admissions records of all students who had enrolled in GC-I from 2014 - 2016 were collected. Data comprised students’ highest SAT (verbal and math) and/or ACT (reading and math) scores, high school GPA, AI, and final letter grade in GC-I. Data from students who were placed into the PS-C course in Fall Semester 2017 or Spring Semester 2018 was also obtained from an OIR request, which included name, major, minor, cumulative GPA, PS-C final letter grade, highest SAT (V/M) or ACT (R/M), AI, and enrollment status at HPU. Faculty teaching the five sections of GC-I were not informed which students had enrolled in PS-C the prior academic year unless the student self-revealed. At the conclusion of the Fall 2018 semester, faculty teaching GC-I submitted the names, ACS raw and percentile scores for the 2009 First Term General Chemistry exam, and final letter grades for all students enrolled in GC-I in Excel format to the authors. We then identified all students who had completed the PS-C course and separated the students into cohorts termed “GC-I Only” (students who did not take the PS-C course first) and “PS-C/GC-I” (students who took PS-C prior to GC-I). Four students who had not successfully completed GC-I the prior semester were placed into the spring PS-C course; data from these students was omitted from this analysis. For Research Question #2, we collected survey responses which included the Growth Mindset Instrument (99), Short Grit Scale (100), and the Chemistry Self-Concept Inventory (101). If students consented to the study, their survey responses were collected anonymously in Qualtrics (www.Qualtrics.com) during the first week of the semester (pre) and the last week of the semester (post).
At the completion of each survey, students followed a link to a second survey where names were entered to confirm completion. The names were not tied to individual responses. This allowed the instructors to incentivize students with a guaranteed bonus point on the final exam for completing both the pre and post surveys. Students who chose not to participate in the study were given an alternative, equally weighted (1 bonus point) question on the final exam. The scoring function within Qualtrics generated each student’s Growth Mindset Score, which ranged from 0 (most fixed) to 6 (most growth-minded) (99). Grit scores were similarly generated but ranged from 0 (least gritty) to 5 (most gritty) (100). Findings from the Chemistry Self-Concept Inventory were analyzed as described previously (101). Microsoft Excel was used to analyze all survey results. Unpaired Student’s t-tests were performed to calculate p-values for survey responses between groups. After collection, data was imported from Excel and analyzed with standard statistical measures built into the R statistical programming language. All figures except the Sankey diagram (Figure 6) were generated using the open-source visualization package ggplot2 in R. Figure 6 was generated using sankeyMATIC.com, an open-source tool built on d3 and its Sankey library.
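The unpaired Student's t-test used for the between-group survey comparisons can be sketched as follows. The authors used Excel and R; this Python version, with hypothetical survey scores, is only illustrative.

```python
# Illustrative sketch (the authors used Excel and R): an unpaired,
# pooled-variance Student's t statistic comparing pre- and post-semester
# survey scores. The sample scores below are hypothetical.
from math import sqrt
from statistics import mean, variance

def unpaired_t(a, b):
    """Two-sample Student's t statistic with pooled variance."""
    na, nb = len(a), len(b)
    # Pooled variance weights each sample variance by its degrees of freedom.
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / sqrt(sp2 * (1 / na + 1 / nb))

pre = [3.1, 3.4, 2.8, 3.9, 3.3, 3.0]   # hypothetical pre-semester mindset scores
post = [3.8, 4.1, 3.5, 4.4, 3.9, 3.6]  # hypothetical post-semester scores

t = unpaired_t(post, pre)
print(f"t = {t:.2f} (df = {len(pre) + len(post) - 2})")
```

The p-value is then read from a t distribution with the stated degrees of freedom (na + nb - 2 for the pooled-variance test).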

Results and Discussion

Research Question #1

Our first research question was, “Does a bridging chemistry course that includes deliberate practice and metacognitive strategies impact the academic performance of at-risk students in a subsequent general chemistry course?” To help answer this question, we analyzed both enrollment and performance data in these courses. Table 2 shows enrollment (SATM and AI) and performance data (midterm and final GC-I course GPA and ACS percentile after completing GC-I) in GC-I and PS-C for the Fall 2017, Spring 2018, and Fall 2018 semesters. Enrollment and performance data from students enrolled in GC-I from Fall 2014 - Fall 2016, before the PS-C intervention was created, were also examined as a historical control and are included in Table 2. There was a total enrollment of 48 students in the two semesters of PS-C, but for this analysis we have removed four students who had already enrolled in GC-I in a prior semester and were placed in the course to improve their problem-solving skills. Thus, only 44 students were considered in PS-C in Fall 2017/Spring 2018, and a cohort of 21 students eventually enrolled in GC-I, which limits the sample size for this study. Figure 6 shows a Sankey diagram of the outcomes of PS-C students and their progress into the GC-I course. A Sankey diagram is useful for visually depicting the “flow” from one event to another. The first observations concern results specifically from PS-C. Only two of the PS-C students (4.2%) withdrew from the class; thus nearly 96% of the students completed the course and received their general education science credit. By the end of Spring 2018, 18.2% of the PS-C students had transferred from HPU, which is slightly higher than the university’s freshman attrition rate (~16%). An additional 29.5% of students changed majors and elected not to continue on to GC-I. The remaining 47.7% of students advanced to GC-I in either the Summer or Fall Semester 2018.
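The outcome "flows" feeding a Sankey diagram like Figure 6 amount to a simple tally of per-student outcomes. The sketch below uses made-up counts, not the study's actual roster.

```python
# Hypothetical sketch of tallying the per-student outcome "flows" that feed
# a Sankey diagram such as Figure 6. The outcome list is invented; it does
# not reproduce the study's actual roster.
from collections import Counter

outcomes = (["advanced to GC-I"] * 10 + ["changed major"] * 6 +
            ["transferred"] * 3 + ["withdrew from PS-C"] * 1)

counts = Counter(outcomes)
total = sum(counts.values())
for outcome, n in counts.most_common():
    print(f"{outcome}: {n} of {total} ({100 * n / total:.1f}%)")
```

Each (source, outcome, count) triple then becomes one ribbon of the diagram, with ribbon width proportional to the count.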
All but one enrolled in GC-I at HPU; the other took GC-I at another institution over the summer. In what might be considered the most successful aspect of the creation of PS-C, almost all students were able to complete the course successfully: the vast majority (88.6%) received a grade in the A - C range. Like many universities, HPU is seriously concerned about student retention. A major point of emphasis for departments is to develop strategies to reduce the DFW rate, since those grades are often indicative of potential retention issues. The departmental GPA for GC-I in 2017 (the year without any of the PS-C students enrolled) was higher (2.63 compared to a historical average of 2.46 for 2014 - 2016), but the performance on the ACS exam was only slightly improved, and certainly within the standard error, raising the question of whether performance in GC-I actually improved or not. Faculty in all sections of GC-I reported that they observed far fewer students who were unengaged and who were clearly struggling from the very first days of the semester. It could simply be that removing 25 - 30% of the most at-risk students would result in the same improved classroom dynamics. In 2017, the DFW rate in GC-I dropped slightly from a running average of nearly 30% to 24%, as shown in Table 3. The withdrawal rate was also slightly lower, but the distribution of students who withdrew may have shifted with the introduction of the PS-C course.


Table 2. Student performance in PS-C by semester and progression to GC-I. Numbers in parentheses are standard deviations.

Cohort                 |  N  | SATM [est. %]     |     AI     |  PS-C GPA*  | PS-C %DFW | % Major Change | % Transfer |  N  | GC-I Midterm GPA | GC-I Final GPA | GC-I %DFW | ACS Exam %tile
-----------------------|-----|-------------------|------------|-------------|-----------|----------------|------------|-----|------------------|----------------|-----------|---------------
GC-I 2014-2016 Control | 623 | 566 (73.0) [62nd] | 78.0 (7.4) | N/A         |    --     |       --       |     --     | 623 | N/A              | 2.46           | 25.7      | 63.0
PS-C Fall 2017         |  13 | 543 (54.0) [54th] | 76.5 (5.1) | 2.52 (0.91) |   15.4    |      38.5      |    15.4    |   6 | 3.42 (0.38)      | 2.55 (0.81)    |  0.0      | 60.0
PS-C Spring 2018       |  31 | 524 (49.0) [47th] | 76.8 (5.0) | 2.46 (0.94) |   12.9    |      25.8      |    19.4    |  15 | 1.99 (1.19)      | 1.91 (0.85)    | 16.1      | 54.4
Combined PS-C (F+S)    |  44 | 530 (85.9) [49th] | 76.7 (5.0) | 2.47 (0.99) |   13.6    |      29.5      |    18.2    |  21 | 2.36 (1.22)      | 2.09 (0.87)    | 11.4      | 56.0
GC-I Only Fall 2018    | 176 | 612 (84.0) [78th] | 84.5 (6.6) | --          |    --     |       --       |     --     | 176 | 2.49 (1.26)      | 2.53 (1.04)    | 32.4      | 69.5
Difference             |  -- | -82               | -7.8       | --          |    --     |       --       |     --     |  -- | -0.13            | -0.44          | -21.0     | -13.5

Note: Rows marked with * report data from the PS-C course only (Fall Semester 2017, Spring Semester 2018, and Combined); the second N column and the four rightmost columns report outcomes of those students in GC-I. The top row gives the historical GC-I control data.

Figure 6. Sankey diagram of PS-C student outcomes from students enrolled Fall 2017 and Spring 2018 (left) to Fall 2018 GC-I (right).

Historically, 63% of the students who have withdrawn from GC-I before completing the term have been first-year students and 37% have been second-year and higher. This is not surprising: first-year students face the biggest adjustment to college and may not yet have developed the study habits needed to succeed in a rigorous course such as general chemistry during their first semester. In addition, by the second year or later, students presumably do not enroll in GC-I unless they are certain they need it for their graduation requirements, since other, less demanding general education science courses are available. In the Fall 2017 GC-I class, the percentage of first-year students who withdrew dropped to only 48%, compared to the historical average of 63% from 2014-2016. It is possible that the creation of PS-C helped to lower the number of first-year students who withdraw from GC-I, which is beneficial for student retention at the university.

Table 3. Withdrawal Statistics for Fall 2017 General Chemistry I

                      Historical Average GC-I (2014-2016)   GC-I Fall 2017       Δ
% DFW                 30.3 (N = 236)                        23.8 (N = 43)     -6.5
% W                   14.3 (N = 110)                        11.9 (N = 23)     -2.4
% First Year W        63.0                                  47.8             -15.2
% Upper Class W       37.0                                  52.2             +15.2
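The Δ column above is simply the Fall 2017 value minus the historical average. A minimal sketch of that bookkeeping, with the values copied from the table (this is illustrative, not the authors' analysis code):

```python
# Percentages copied from Table 3; Δ = Fall 2017 minus historical average.
historical = {"% DFW": 30.3, "% W": 14.3, "% First Year W": 63.0, "% Upper Class W": 37.0}
fall_2017 = {"% DFW": 23.8, "% W": 11.9, "% First Year W": 47.8, "% Upper Class W": 52.2}

# Round to one decimal place to match the table's precision.
delta = {k: round(fall_2017[k] - historical[k], 1) for k in historical}
print(delta)  # {'% DFW': -6.5, '% W': -2.4, '% First Year W': -15.2, '% Upper Class W': 15.2}
```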

Figure 7 shows the percentage of grades assigned to students who enrolled in GC-I Only compared to those who enrolled in PS-C followed by GC-I in the Fall Semester 2018. Table 2 also reports the GPA and final ACS percentile scores of the GC-I Only cohort compared with the PS-C/GC-I cohorts, separated by the semester PS-C was offered along with an aggregate of both semesters combined. Students who enrolled in GC-I after PS-C earned nearly half a letter grade lower GPA than the GC-I Only students (Δ = -0.44). More worrisome was the drop in grades from the midterm to the final, where the GPA fell from 2.36 to 2.09 for the combined PS-C/GC-I cohort. Looking closer, the contrast lay in midterm performance: the Fall 2017 PS-C cohort (albeit with only 6 students) saw nearly a letter grade GPA drop from midterm (3.42) to final (2.55), while the Spring 2018 cohort did not show a measurable change (1.99 to 1.91).

Figure 7. Fall 2018 percent grade distribution of students enrolled in GC-I Only compared to PS-C followed by GC-I.

The transition from material covered in PS-C to new material (beyond Chapter 4) occurs about two weeks prior to midterm grades, so everyone who had completed PS-C had already covered the majority of the material, from the same textbook, using the same online homework and, in many cases, with the same instructors and examination styles, prior to the midterm. The performance of the PS-C/GC-I students at midterm was within the statistical margin of error of the students who had enrolled in GC-I Only (GPA = 2.36 vs. 2.49), but between the midterm and the final exam, over a third (38%) of the PS-C students saw their grades drop by a letter grade or more, and none improved by more than half a letter grade. This suggests that students did not successfully make the transition to the new material despite the prior semester of training in problem solving and metacognitive strategies. A possible explanation is that the problem-solving and metacognitive strategies employed in PS-C were not continued in GC-I, so any beneficial effects began to wane.

Figure 8 shows the performance on the ACS exam of students enrolled in GC-I Only and PS-C/GC-I. While 27.7% of students who enrolled in GC-I Only scored at or above the 90th percentile, the highest score in the PS-C/GC-I cohort was the 87th percentile.

A final issue of concern is the performance of URM students progressing from PS-C into GC-I. We aimed to design a course using principles that had been reported to be effective for URM students in order to improve their performance in GC-I (12, 27). PS-C was designed to be an encouraging environment where students could gain confidence in their ability to successfully solve challenging chemistry problems. Anecdotally, the authors have observed for several years the struggles of URM students in our introductory classes. Stuck in a negative feedback loop, they are overwhelmed from the earliest days of the class and then withdraw, or fail to earn a passing grade, at a disproportionate rate compared to their majority peers. It should be noted that, like many suburban private liberal arts colleges, HPU is not a particularly diverse campus (18.4% minority, with only 4.7% African American). Yet URM enrollment in PS-C was disproportionate, with 25% of the registered PS-C students African American; the spring semester had 29% African American students enrolled. Recall that all of the spring students had a fall semester math placement of pre-calculus or below. It should also be noted that since HPU is a private university with a relatively small endowment, there are few students from severely economically disadvantaged households; students enrolled in our classes are generally not from lower socioeconomic classes, but may come from underperforming primary and secondary schools.

Figure 8. Fall 2018 semester 2009 First Term General Chemistry ACS examination percentile score as a function of final course GPA for GC-I Only compared to PS-C/GC-I. Linear regression shown for each data set.

There is one final note about identifying at-risk students using AI. Figure 3 shows the historical distribution of admission indices against final GC-I grade results. All students who withdrew without a final grade assigned were arbitrarily assigned a GPA = 0.1 and are shown with open circles. Lines have been annotated on Figure 3 at GPA = 1.5 to show the C- progression cutoff and at AI = 78 to show the AI recommendation from Figure 5. This divides the plot into four quadrants of students. Quadrant I contains the students who were predicted by AI to succeed in GC-I and in fact did; a relatively high percentage of students, 45.6%, are in this category. Quadrant III contains students who were predicted not to succeed and, in fact, did not. Combining these two quadrants, using AI alone with no intervention from advising would have correctly predicted the outcomes of 65.3% of incoming students. Quadrants II and IV can be considered false predictions and are more problematic from a placement standpoint. Most worrisome are the Quadrant II students, who were predicted not to pass GC-I yet did; these represent a fairly high percentage, 28.6%. Quadrant IV students were predicted to succeed but did not, and represent only 6.1% of students. Note that there is a relatively high rate of withdrawals in Quadrant IV (20% of the total withdrawals). In some sense, the predictive power of AI is not unreasonable, given that over 65% of students could be correctly identified. Even within Quadrant II, the most troublesome region, a relatively small percentage of students ultimately earned a B or higher grade, and only 2 students out of 623 earned an A grade.
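The quadrant bookkeeping above reduces to a simple two-threshold classification. A minimal sketch (not the authors' code; thresholds are those stated in the text, AI ≥ 78 for predicted success and final GPA ≥ 1.5, i.e. C-, for actual progression):

```python
# Classify students by predicted (AI) vs. actual (final GPA) GC-I outcome.
# Withdrawals were assigned GPA = 0.1 in the text, so they fall below pass_gpa.

def quadrant(ai: float, final_gpa: float,
             ai_cutoff: float = 78.0, pass_gpa: float = 1.5) -> str:
    """Return the quadrant label for one student."""
    predicted_pass = ai >= ai_cutoff
    actual_pass = final_gpa >= pass_gpa
    if predicted_pass and actual_pass:
        return "I"    # predicted to succeed and did
    if not predicted_pass and actual_pass:
        return "II"   # predicted not to pass, yet did (false negative)
    if not predicted_pass and not actual_pass:
        return "III"  # predicted not to pass and did not
    return "IV"       # predicted to succeed but did not (false positive)

def prediction_accuracy(students) -> float:
    """Fraction of (ai, final_gpa) pairs falling in Quadrants I or III."""
    correct = sum(quadrant(ai, gpa) in ("I", "III") for ai, gpa in students)
    return correct / len(students)
```

Under this bookkeeping, Quadrants I and III are the correct predictions, so the 65.3% figure quoted above is simply this accuracy computed over the historical roster.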
In other words, most of them earned some form of a C, which typically is a warning sign for continued success in the second semester of general chemistry and on into organic chemistry. It should be noted that AI was the secondary means of identifying at-risk students based on Figure 5; the primary means was SATM, with AI used only for students in the range between SATM 590 and 620. That said, Figure 3 is a potential cautionary tale for anyone trying to devise a quantitative means of identifying at-risk students. Despite the success, nearly 35% of students would not have been properly identified, and nearly 30% of them have historically passed the first semester of general chemistry. In a recent New York Times article about universities using data analytics to predict student graduation success, Martin Kurzweil, a program director at Ithaka S+R, reminds readers that "algorithm is not destiny" (102). He continues, "It is important that human judgement is never removed from the process and that there is always an opportunity for a student to appeal a pathway that is being plotted for them." This is important advice for all educators seeking to identify at-risk students and improve their ultimate outcomes.

Research Question #2

Our second research question was, "Does a bridging chemistry course designed with deliberate practice and metacognitive strategies impact the mindsets and self-concepts of students enrolled in introductory chemistry?" As shown in Figure 9, the average growth mindset scores (which can range from 0 to 6) (99) were highest for students who were either enrolled in or had completed the PS-C course. This difference held at both the beginning and end of the semester. Although no differences were statistically significant, it is interesting that students enrolled in PS-C reported more of a growth mindset than their peers in GC-I. This may be due to the direct messaging from PS-C instructors, as instructors of GC-I did not take the same messaging approach.

Figure 9. Growth Mindset instrument results from the beginning (pre) and end (post) of the semester. Average scores can range from 0 to 6. Error bars are standard error of the mean.

All students across the university receive information on growth mindset, as it is the focus of HPU's Quality Enhancement Plan. It should be noted that no cohort of students reported an enhancement in their growth mindset at the conclusion of the semester, consistent with what others using this instrument at HPU have observed (unpublished data). Elevated stress levels and student time constraints in the week leading up to final exams may contribute to this slight decrease in scores; logistically, post-survey timing requires that we administer this (and other) surveys immediately before final exams. For context, the entire HPU freshman population entering the university in Fall 2017 reported an average growth mindset score of 4.27, while chemistry students reported scores of 4.46 - 4.83. A separate growth mindset study in our department also showed scores higher than the general student population (unpublished data), indicating that students taking these courses (primarily biology majors) may have more of a growth mindset than their peers. Additional research will be needed to confirm this.
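The error bars in Figures 9-11 are standard errors of the mean. For reference, a generic sketch of that computation (the scores below are illustrative placeholders, not the study's data):

```python
import math

def mean_sem(scores):
    """Return (mean, standard error of the mean) for a list of survey scores."""
    n = len(scores)
    mean = sum(scores) / n
    # Sample variance (n - 1 denominator), then SEM = s / sqrt(n).
    variance = sum((x - mean) ** 2 for x in scores) / (n - 1)
    return mean, math.sqrt(variance / n)

pre_scores = [4.0, 5.0, 6.0]  # hypothetical 0-6 growth mindset responses
m, sem = mean_sem(pre_scores)
```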

Duckworth's Short Grit Scale was used to measure overall grit in our student populations. As shown in Figure 10, grit scores were similar between PS-C and GC-I students across the semester. This limited range of scores may indicate that grit was not enhanced by discussing it in the PS-C course; however, in light of a recent meta-analysis that questions the validity of the grit instrument, it is quite possible that the results are not meaningful (103).

Finally, the self-concepts of our students were measured in chemistry, mathematics, overall academics, and academic enjoyment at the beginning and end of the semester. These percentages can be interpreted as the degree to which students perceive themselves to be strong in a particular area (subscale). A subset of results is shown in Figure 11. The largest positive shift in self-concept was in the academic subscale and was reported by students enrolled in the PS-C course. A smaller, yet still positive, shift was also reported in the chemistry subscale among these students, but not in math. This indicates that student perceptions in general academics and chemistry may have been enhanced, although not in math and not in a statistically significant way. It is important to note that the range of scores we observed is smaller than in other reports using this same instrument (101). Academic enjoyment was the highest subscale for all students, regardless of whether they were at risk, and remained unchanged at the end of the semester.

Figure 10. Grit survey results from the beginning (pre) and end (post) of the semester. Average scores can range from 0 to 5. Error bars are standard error of the mean.

The Grand Challenge of Gateway Courses

A major conclusion of Lee's meta-analysis is that very few reported preparatory courses, of either the service or bridging type, produced statistically significant changes in student learning outcomes (5). It should be noted that none of them worsened student learning outcomes, but because the studies were statistically underpowered, were not carefully controlled, or, as with many of the older studies, relied on intuition and experience rather than evidence, there are few reported preparatory chemistry courses that can be held up as exemplars for others. It should also be noted that Lee's analysis was limited to courses developed at research universities and does not cover studies from primarily undergraduate institutions. The rightmost column in Table 1 summarizes the results of Lee's meta-analysis for statistical significance of claimed results where applicable. According to Lee, the three most effective interventions were the SAGE project at Duke University (2), a PLTL intervention at Washington University in St. Louis (50), and Lee's own Treisman-style workshops at Cornell (5). The Cornell and Washington University programs are both concurrent service courses, while the Duke SAGE model has both a bridging and a service option depending on student SATM placement.

Figure 11. Self-concept instrument results from the beginning (pre) and end (post) of the semester. Average sub-scale scores can range from 0 to 100%.

Assuming that a service course is the ideal model from both a pragmatic and philosophical perspective, and given how long chemical educators have credibly been able to identify at-risk students with a variety of metrics, a logical question is: why are there only a small handful of studies in the literature that appear to be successful and statistically significant, and why are these only from the most highly selective institutions? In other words, can the promising results reported by educators at the top-50 most selective institutions, which enroll only 4.4% of all the students in the United States, be successfully transferred to the vast majority of institutions such as High Point University, which serve over 75% of undergraduate students and are much more representative of the population of university students as a whole? Are these principles and courses broadly applicable to all institutions, many of which face much more significant headwinds than HPU in terms of the socioeconomic and educational backgrounds of their enrolled students? These are the defining questions facing chemistry educators in gateway courses like general and organic chemistry.

We propose the following tentative explanation for the disparate literature and the fact that the most successful interventions seem to be reported only from highly selective universities. First, one must recognize that some of these effects could simply be due to selection bias, in that large research universities are more likely to have the resources and expertise, in terms of faculty, GTAs, facilities, and finances, to carefully design and run controlled studies with specialized support laboratories, peer-leaders, etc.
Second, these same institutions have thousands of students enrolled over a several-year time frame, meaning that sufficient sampling can be obtained, resulting in adequately powered studies. Given those caveats, we begin by assuming that the BFLPE and IRD effects are real phenomena, and that being in the bottom quartile of students would certainly affect the academic self-concept of at-risk students within any given cohort. If true, identifying and solving the BFLPE/IRD problem for at-risk students at highly selective universities is a far more tractable problem than solving it at the vast majority of less-selective institutions.

Looking at the criteria used to identify at-risk students in the three most noteworthy preparatory chemistry studies is instructive. Students in the service course at Washington University had a mean ACTM = 33 (98th percentile) (27), Duke University used the bottom quartile for SATM (≤ 670, 89th percentile) (2), and the Cornell students could elect to participate in the workshops but were not specifically identified or required to attend (5). It should be noted that the bottom quartile of all Cornell students have a SATM ≤ 700 (92nd percentile). Educators at a more "average" institution would likely have fewer than 20% of their entire general chemistry population achieve such high standardized math testing scores (e.g., see Figure 2 from HPU); such students would actually be in the top quartile of their cohort. Thus, designing interventions for such high-potential students should be more straightforward and should have a much higher probability of success than designing them for students with the national median or lower SATM score. In fact, Hall et al. address this very issue in terms of cognitive load theory (104) for the Duke SAGE students, observing that it is not surprising that, without intervention, these students would drop out of STEM in disproportionate numbers relative to what would be expected from their past academic achievements, owing to affective concerns such as the BFLPE and mindset (2).

A greater challenge for chemistry educators is determining what interventions can make a statistically significant difference in the outcomes of at-risk students at an institution with an average or even below-average student profile. The dearth of replicable literature on significantly improving the outcomes of these students suggests that this remains a grand challenge for our discipline. One study (24) from the University of South Florida hints at the tantalizing potential of the roughly 10% of at-risk students who have outperformed their expected outcomes, whom the authors term "risers". These students not only outperform their expected outcomes, but even outperform the overall general chemistry student population.
In their study, the authors comment that it is not known why risers are able to outperform expectations, but that it is an important area for further research (24).

Conclusions, Limitations, and Future Work

The persistently high attrition rate of students in gateway STEM courses represents a major challenge for undergraduate educators. The literature has consistently shown that students in the bottom quarter to third of any given cohort are at significant risk of not progressing through a STEM major and of not being retained at their university. This is of particular concern for URM and first-generation students and represents a grand challenge for the STEM disciplines. Whether this is due to the big-fish-little-pond effect, the individual relative deprivation effect, mindset, grit, or some other psychosocial phenomenon is important to consider, but it is not the most pressing concern for chemistry educators. The fact that this attrition problem has been reported for all types of institutions (large and small, public and private, highly selective and less selective) points to how ubiquitous it is. While there have been a considerable number of reports on a variety of preparatory chemistry courses using different evidence-based practices over the past several decades, the number of studies that report statistically significant results, have adequate statistical power, and have been replicated approaches zero. The most promising studies have come from highly selective research universities, and at this point in time it is premature to determine whether the interventions described can be more broadly applied to less selective institutions.

We began implementing a range of evidence-based practices into our general chemistry curriculum at High Point University five years ago and have seen the fruits of our labor, observing significant improvement on external metrics such as ACS examinations and the physical science portion of the MCAT. While we lacked a control group for the effectiveness of each pedagogical method employed in our courses, the combination of peer learning, student instructors, SI, deliberate practice, and growth mindset messaging has resulted in dramatic improvements in student learning outcomes. However, those improvements have only amplified the problem for our at-risk students, which led to the creation of a one-semester bridging chemistry course taken prior to enrollment in general chemistry. Overall, students participating in PS-C perform well while taking the course and engaging in metacognitive strategies. However, neither the strategies implemented in the bridging course nor student performance seems to carry over to the subsequent general chemistry course. The results of this study are limited by sample size and by the institution at which the course was implemented. The combination of deliberate practice and metacognitive interventions seems to have a positive effect in general, but it does not appear to specifically improve the performance of at-risk students when they enter the general chemistry population. The fall semester cohort had the most promising preliminary outcomes, which may not be surprising since they were statistically closest to the general chemistry population. We are using the results and findings from this study to modify our course design to more carefully utilize deliberate practice in Treisman-style workshops while educating students on the affective domain of learning. We are also using feedback from this course to more specifically implement deliberate practice and metacognitive strategies in all sections of general chemistry in the future.

References

1. Tai, R. H.; Sadler, P. M.; Loehr, J. F. Factors Influencing Success in Introductory College Chemistry. J. Res. Sci. Teach. 2005, 42 (9), 987–1012.
2. Hall, D. M.; Curtin-Soydan, A. J.; Canelas, D. A. The Science Advancement through Group Engagement Program: Leveling the Playing Field and Increasing Retention in Science. J. Chem. Educ. 2013, 91 (1), 37–47.
3. Lewis, S. E.; Lewis, J. E. Predicting At-Risk Students in General Chemistry: Comparing Formal Thought to a General Achievement Measure. Chem. Educ. Res. Pract. 2007, 8, 32–51.
4. Bentley, A. B.; Gellene, G. I. A Six-Year Study of the Effects of a Remedial Course in the Chemistry Curriculum. J. Chem. Educ. 2005, 82 (1), 125–130.
5. Lee, S.; Crane, B. R.; Ruttledge, T.; Guelce, D.; Yee, E. F.; Lenetsky, M.; Caffrey, M.; Johnsen, W.; Lin, A.; Lu, S.; Rodriguez, M.-A.; Wague, A.; Wu, K. Patching a Leak in an R1 University Gateway STEM Course. PLoS One 2018, 13 (9), e0202041.
6. Barr, D. A.; Matsui, J.; Wanat, S. F. Chemistry Courses as the Turning Point for Premedical Students. Adv. Health Sci. Educ. 2010, 15 (1), 45–54.
7. Nocera, D. G.; Harrison, J. F. Enhanced Performance in Chemistry by Minorities at the University Level: A Comprehensive Program. J. Chem. Educ. 1996, 73 (12), 1131–1137.
8. Graham, M. J.; Frederick, J.; Byars-Winston, A.; Hunter, A.-B.; Handelsman, J. Increasing Persistence of College Students in STEM. Science 2013, 341, 1455–1456.
9. Cooper, M. M.; Stowe, R. L. Chemistry Education Research—From Personal Empiricism to Evidence, Theory, and Informed Practice. Chem. Rev. 2018, 118 (12), 6053–6087.
10. Hsu, E.; Murphy, T. J.; Treisman, U. Supporting High Achievement in Introductory Mathematics Courses: What We Have Learned from 30 Years of the Emerging Scholars Program. In Making the Connection: Research and Teaching in Undergraduate Mathematics; Carlson, M. P., Rasmussen, C., Eds.; Mathematical Association of America: Washington, DC, 2011; pp 205−220.


11. Treisman, U. Studying Students Studying Calculus: A Look at the Lives of Minority Mathematics Students in College. Coll. Math. J. 1992, 23 (5), 362–372.
12. Adams, G. M.; Lisy, J. M. The Chemistry Merit Program: Reaching, Teaching, and Retaining Students in the Chemical Sciences. J. Chem. Educ. 2007, 84 (4), 721–726.
13. Hrabowski, F. A. Beating the Odds: Preparing Minorities for Research Careers in the Chemical Sciences. In Minorities in the Chemical Workforce: Diversity Models That Work; National Academies Press: Washington, DC, 2003; p 89.
14. Ericsson, K. A.; Krampe, R. Th.; Tesch-Römer, C. The Role of Deliberate Practice in the Acquisition of Expert Performance. Psychol. Rev. 1993, 100 (3), 363–406.
15. Ericsson, K. A.; Pool, R. Peak: Secrets from the New Science of Expertise; Houghton Mifflin Harcourt: Boston, MA, 2016.
16. Dweck, C. S. Mindset: The New Psychology of Success; Ballantine Books: New York, 2006.
17. Duckworth, A. Grit: The Power of Passion and Perseverance; Scribner: New York, 2016.
18. Scofield, M. B. An Experiment in Predicting Performance in General Chemistry. J. Chem. Educ. 1927, 4, 1168–1175.
19. Pickering, M. Helping the High Risk Freshman Chemist. J. Chem. Educ. 1975, 52 (8), 512–514.
20. Andrews, M. H.; Andrews, L. First-Year Chemistry Grades and SAT Math Scores. J. Chem. Educ. 1979, 56, 231–232.
21. Spencer, H. E. Mathematical SAT Test Scores and College Chemistry Grades. J. Chem. Educ. 1996, 73, 1150–1153.
22. Wagner, E. P.; Sasser, H.; DiBiase, W. J. Predicting Students at Risk in General Chemistry Using Pre-Semester Assessments and Demographic Information. J. Chem. Educ. 2002, 79, 749–755.
23. Ozsogomonyan, A.; Loftus, D. Predictors of General Chemistry Grades. J. Chem. Educ. 1979, 56 (3), 173–175.
24. Ralph, V. R.; Lewis, S. E. Chemistry Topics Posing Incommensurate Difficulty to Students with Low Math Aptitude Scores. Chem. Educ. Res. Pract. 2018, 19 (3), 867–884.
25. Jones, K. B.; Gellene, G. I. Understanding Attrition in an Introductory Chemistry Sequence Following Successful Completion of a Remedial Course. J. Chem. Educ. 2005, 82 (8), 1241–1245.
26. Mills, P.; Sweeney, W.; Bonner, S. M. Using the First Exam for Student Placement in Beginning Chemistry Courses. J. Chem. Educ. 2009, 86 (6), 738–743.
27. Shields, S. P.; Hogrebe, M. C.; Spees, W. M.; Handlin, L. B.; Noelken, G. P.; Riley, J. M.; Frey, R. F. A Transition Program for Underprepared Students in General Chemistry: Diagnosis, Implementation, and Evaluation. J. Chem. Educ. 2012, 89 (8), 995–1000.
28. Pederson, L. G. The Correlation of Partial and Total Scores of the SAT of the College Entrance Examination Board with Grades in Freshman Chemistry. Educ. Psychol. Meas. 1975, 35, 509–511.
29. Rixse, J. S.; Pickering, M. Freshman Chemistry as a Predictor of Future Academic Success. J. Chem. Educ. 1985, 62 (4), 313–315.
30. McFate, C.; Olmsted, J., III. Assessing Student Preparation through Placement Tests. J. Chem. Educ. 1999, 76 (4), 562–565.

31. Richardson, M.; Abraham, C.; Bond, R. Psychological Correlates of University Students’ Academic Performance: A Systematic Review and Meta-Analysis. Psychol. Bull. 2012, 138 (2), 353–387.
32. Greco, J. B. Studio Format General Chemistry: A Method for Increasing Chemistry Success for Students of Underrepresented Backgrounds. In Increasing Retention of Underrepresented Students in STEM Through Affective and Cognitive Interventions; Kishbaugh, T. L. S., Cessna, S. G., Eds.; ACS Symposium Series 1301; American Chemical Society: Washington, DC, 2018; pp 131−143.
33. Stone, K. L.; Shaner, S. E.; Fendrick, C. M. Improving the Success of First Term General Chemistry Students at a Liberal Arts Institution. Educ. Sci. 2018, 8 (1), 5.
34. Elliott, R.; Strenta, C. A.; Adair, R.; Matier, M.; Scott, J. The Role of Ethnicity in Choosing and Leaving Science in Highly Selective Institutions. Res. High. Ed. 1996, 37 (6), 681–709.
35. Gladwell, M. David and Goliath: Underdogs, Misfits and the Art of Battling Giants; Little, Brown and Co.: New York, 2013.
36. Sackett, P. R.; Kuncel, N. R.; Beatty, A. S.; Rigdon, J. L.; Kiger, T. B. The Role of Socioeconomic Status in SAT-Grade Relationships and in College Admissions Decisions. Psychol. Sci. 2012, 23 (9), 1000–1007.
37. Fang, J.; Huang, X.; Zhang, M.; Huang, F.; Li, Z.; Yuan, Q. The Big-Fish-Little-Pond Effect on Academic Self-Concept: A Meta-Analysis. Front. Psychol. 2018, 9, 1569.
38. Marsh, H. W. The Big-Fish-Little-Pond Effect on Academic Self-Concept. J. Educ. Psychol. 1987, 79, 280–294.
39. Marsh, H. W.; Trautwein, U.; Lüdtke, O.; Baumert, J.; Köller, O. The Big-Fish-Little-Pond Effect: Persistent Negative Effects of Selective High Schools on Self-Concept after Graduation. Amer. Educ. Res. J. 2007, 44 (3), 631–669.
40. Marsh, H. W.; Hau, K. T. Big-Fish-Little-Pond Effect on Academic Self-Concept: A Cross-Cultural (26-Country) Test of the Negative Effects of Academically Selective Schools. Amer. Psychol. 2003, 58, 364–376.
41. Smith, H. J.; Pettigrew, T. F.; Pippin, G. M.; Bialosiewicz, S. Relative Deprivation: A Theoretical and Meta-Analytic Review. Person. Soc. Psychol. Rev. 2012, 16 (3), 203–232.
42. Waratuke, S.; Kling, T. Interdisciplinary Research in a Dense Summer Bridge: The Role of a Writing Intensive Chemistry Seminar. J. Chem. Educ. 2016, 93 (8), 1391–1396.
43. Schmid, S.; Youl, D. J.; George, A. V.; Read, J. R. Effectiveness of a Short, Intense Bridging Course for Scaffolding Students Commencing University-Level Study of Chemistry. Int. J. Sci. Educ. 2012, 34, 1211–1324.
44. Garland, E. R.; Garland, H. T. Preparation for High School Chemistry: The Effects of a Summer School Course on Student Achievement. J. Chem. Educ. 2006, 83, 1698–1702.
45. Steele, C. M.; Aronson, J. Stereotype Threat and the Intellectual Test Performance of African Americans. J. Person. Soc. Psychol. 1995, 69, 797–811.
46. Steele, C. M. A Threat in the Air: How Stereotypes Shape Intellectual Identity and Performance. Amer. Psychol. 1997, 52, 613–629.
47. Spencer, S. J.; Steele, C. M.; Quinn, D. Stereotype Threat and Women’s Math Performance. J. Exper. Soc. Psychol. 1999, 35, 4–28.


48. Lyle, K. S.; Robinson, W. R. A Statistical Evaluation: Peer-Led Team Learning in an Organic Chemistry Course. J. Chem. Educ. 2003, 80, 132–134.
49. Gafney, L.; Varma-Nelson, P. Peer-Led Team Learning: Evaluation, Dissemination, and Institutionalization of a College Level Initiative; Springer: New York, 2010; Vol. 16.
50. Hockings, S. C.; DeAngelis, K. J.; Frey, R. F. Peer-Led Team Learning in General Chemistry: Implementation and Evaluation. J. Chem. Educ. 2008, 85 (7), 990–996.
51. POGIL: Process Oriented Guided Inquiry Learning; Oxford University Press: New York, 2008.
52. Farrell, J. J.; Moog, R. S.; Spencer, J. N. A Guided Inquiry General Chemistry Course. J. Chem. Educ. 1999, 76, 570–574.
53. Spencer, J. N. New Approaches to Chemistry Teaching: 2005 George C. Pimentel Award. J. Chem. Educ. 2006, 83, 528–533.
54. Oliver-Hoyo, M.; Beichner, R. SCALE-UP: Bringing Inquiry-Guided Learning to Large Enrollment Classes. In Teaching and Learning Through Inquiry; Lee, V. S., Ed.; Stylus: Sterling, VA, 2004; pp 51−79.
55. Gaffney, J. D. H.; Richards, E.; Kustusch, M. B.; Ding, L.; Beichner, R. J. Scaling Up Educational Reform. J. Coll. Sci. Teach. 2008, 37, 48–53.
56. Handelsman, J.; Ebert-May, D.; Beichner, R. J.; Bruns, P.; Chang, A.; DeHaan, R.; Gentile, J.; Lauffer, S.; Stewart, J.; Tilghman, S. M.; Wood, W. B. Scientific Teaching. Science 2004, 304, 521–522.
57. Apple, T.; Cutler, A. The Rensselaer Studio Chemistry Course. J. Chem. Educ. 1999, 76, 462–463.
58. Gottfried, A. C.; Sweeder, R. D.; Bartolin, J. M.; Hessler, J. A.; Reynolds, B. P.; Stewart, I. C.; Coppola, B. P.; Holl, M. B. M. Design and Implementation of a Studio-Based General Chemistry Course. J. Chem. Educ. 2007, 84, 265–270.
59. Bailey, C. A.; Kingsbury, K.; Kulinowski, K.; Paradis, J.; Schoonover, R. An Integrated Lecture-Laboratory Environment for General Chemistry. J. Chem. Educ. 2000, 77, 195–199.
60. Kiste, A. L.; Scott, G. E.; Bukenberger, J.; Markmann, M.; Moore, J. An Examination of Student Outcomes in Studio Chemistry. Chem. Educ. Res. Pract. 2017, 18, 233–249.
61. Lundeberg, M. A. Supplemental Instruction in Chemistry. J. Res. Sci. Teach. 1990, 27, 145–155.
62. Rath, K. A.; Peterfreund, A.; Bayless, F.; Runquist, E.; Simonis, U. Impact of Supplemental Instruction in Entry-Level Chemistry Courses at a Midsized Public University. J. Chem. Educ. 2012, 89, 449–455.
63. Fink, A.; Cahill, M. J.; McDaniel, M. A.; Hoffman, A.; Frey, R. F. Improving General Chemistry Performance through a Growth Mindset Intervention: Selective Effects on Underrepresented Minorities. Chem. Educ. Res. Pract. 2018, 19, 783–806.
64. Eichler, J. F.; Peoples, J. Flipped Classroom Modules for Large Enrollment General Chemistry Courses: A Low Barrier Approach to Increase Active Learning and Improve Student Grades. Chem. Educ. Res. Pract. 2016, 17, 197–207.
65. Hibbard, L.; Sung, S.; Wells, B. Examining the Effectiveness of a Semi-Self-Paced Flipped Learning Format in a College General Chemistry Sequence. J. Chem. Educ. 2016, 93, 24–30.
66. Genyea, J. Improving Students’ Problem Solving Skills: A Methodical Approach for a Preparatory Chemistry Course. J. Chem. Educ. 1983, 60 (6), 478–482.
67. Wink, D. J.; Gislason, S. F.; Zusman, B. J.; Mebane, R. C.; McNicholas, S. D. The MATCH Program: A Preparatory Chemistry and Intermediate Algebra Curriculum. J. Chem. Educ. 2000, 77 (8), 999–1000.
68. Ye, L.; Oueini, R.; Dickerson, A. P.; Lewis, S. E. Learning Beyond the Classroom: Using Text Messages to Measure General Chemistry Students’ Study Habits. Chem. Educ. Res. Pract. 2015, 16, 869–878.
69. Ye, L.; Shuniak, C.; Oueini, R.; Robert, J.; Lewis, S. E. Can They Succeed? Exploring At-Risk Students’ Study Habits in College General Chemistry. Chem. Educ. Res. Pract. 2016, 17, 878–892.
70. Mason, D.; Verdel, E. Gateway to Success for At-Risk Students in a Large-Group Introductory Chemistry Class. J. Chem. Educ. 2001, 78 (2), 252–255.
71. Gutwill-Wise, J. P. The Impact of Active and Context-Based Learning in Introductory Chemistry Courses: An Early Evaluation of the Modular Approach. J. Chem. Educ. 2001, 78 (5), 684–690.
72. Dougherty, R. C.; Bowen, C. W.; Berger, T.; Rees, W.; Mellon, E. K.; Pulliam, E. Cooperative Learning and Enhanced Communication: Effects on Student Performance, Retention, and Attitudes in General Chemistry. J. Chem. Educ. 1995, 72 (9), 793–797.
73. Fullilove, R. E.; Treisman, U. Mathematics Achievement among African American Undergraduates at the University of California, Berkeley: An Evaluation of the Mathematics Workshop Program. J. Negro Educ. 1990, 59 (3), 463–478.
74. Murphy, T. J.; Stafford, K. L.; McCreary, P. Subsequent Course and Degree Paths of Students in a Treisman-Style Workshop Calculus Program. J. Wom. Minor. Sci. Engin. 1998, 4, 381–396.
75. Ioannidis, J. P. A. Why Most Published Research Findings Are False. PLoS Med. 2005, 2 (8), 696–701.
76. Lewis, S. E.; Lewis, J. E. Seeking Effectiveness and Equity in a Large College Chemistry Course: An HLM Investigation of Peer-Led Guided Inquiry. J. Res. Sci. Teach. 2008, 45 (7), 794–811.
77. Hunter, N. W. A Chemistry Prep Course That Seems To Work. J. Chem. Educ. 1976, 53 (5), 301.

67. Wink, D. J.; Gislason, S. F.; Zusman, B. J.; Mebane, R. C.; McNicholas, S. D. The MATCH Program: A Preparatory Chemistry and Intermediate Algebra Curriculum. J. Chem. Educ. 2000, 77 (8), 999–1000. 68. Ye, L.; Oueini, R.; Dickerson, A. P.; Lewis, S. E. Learning Beyond the Classroom: Using Text Messages to Measure General Chemistry Students’ Study Habits. Chem. Educ. Res. Pract. 2015, 16, 869–878. 69. Ye, L.; Shuniak, C.; Oueini, R.; Robert, J.; Lewis, S. E. Can They Succeed? Exploring AtRisk Students’ Study Habits in College General Chemistry. Chem. Educ. Res. Pract. 2016, 17, 878–892. 70. Mason, D.; Verdel, E. Gateway to Success for At-Risk Students in a Large-Group Introductory Chemistry Class. J. Chem. Educ. 2001, 78 (2), 252–255. 71. Gutwill-Wise, J. P. The Impact of Active and Context-Based Learning in Introductory Chemistry Courses: An Early Evaluation of the Modular Approach. J. Chem. Educ. 2001, 78 (5), 684–690. 72. Dougherty, R. C.; Bowen, C. W.; Berger, T.; Rees, W.; Mellon, E. K.; Pulliam, E. Cooperative Learning and Enhanced Communication: Effects on Student Performance, Retention, and Attitudes in General Chemistry. J. Chem. Educ. 1995, 72 (9), 793–797. 73. Fullilove, R. E.; Treisman, U. Mathematics Achievement among African American Undergraduates at the University of California, Berkeley: An Evaluation of the Mathematics Workshop Program. J. Negro Educ. 1990, 59 (3), 463–478. 74. Murphy, T. J.; Stafford, K. L.; McCreary, P. Subsequent Course and Degree Paths of Students in a Treisman-Style Workshop Calculus Program. J. Wom. Minor. Sci. Engin. 1998, 4, 381–396. 75. Ioannidis, J. P. A. Why Most Published Research Findings Are False. PLoS Med. 2005, 2 (8), 696–701. 76. Lewis, S. E.; Lewis, J. E. Seeking Effectiveness and Equity in a Large College Chemistry Course: An HLM Investigation of Peer-Led Guided Inquiry. J. Res. Sci. Teach. 2008, 45 (7), 794–811. 77. Hunter, N. W. A Chemistry Prep Course That Seems To Work. J. Chem. Educ. 1976, 53 (5), 301. 
78. Walmsley, F. A Course for the Underprepared Chemistry Student. J. Chem. Educ. 1977, 54 (5), 314–315. 79. Murphy, K. Using a Personal Response System To Map Cognitive Efficiency and Gain Insight into a Proposed Learning Progression in Preparatory Chemistry. J. Chem. Educ. 2012, 89 (10), 1229–1235. 80. Meckstroth, W. K. A Chemistry Course for Underprepared Students. J. Chem. Educ. 1974, 51 (5), 329. 81. Botch, B.; Day, R.; Vining, W.; Stewart, B.; Hart, D.; Rath, K.; Peterfreund, A. Effects on Student Achievement in General Chemistry following Participation in an Online Preparatory Course. Chemprep, A Voluntary, Self-Paced, Online Introduction to Chemistry. J. Chem. Educ. 2007, 84 (3), 547–553. 82. Moreno, S. E.; Muller, C.; Asera, R.; Wyatt, L.; Epperson, J. Supporting Minority Mathematics Achievement: The Emerging Scholars Program at the University of Texas at Austin. J. Wom. Minor. Sci. Engin. 1999, 5, 53–66. 45

83. Alexander, B. B.; Burda, A. C.; Millar, S. B. A Community Approach to Learning Calculus: Fostering Success for Underrepresented Ethnic Minorities in an Emerging Scholars Program; University of Wisconsin: Madison, WI, 1996. 84. Alexander, B. B.; Burda, A. C.; Millar, S. B. A Community Approach to Learning Calculus: Fostering Success for Underrepresented Ethnic Minorities in an Emerging Scholars Program. J. Wom. Minor. Sci. Engin. 1997, 3 (3), 145–159. 85. Freeman, M. MathExcel: A Special Opportunity In Calculus; Department of Mathematics, Univeristy of Kentucky, 1995. 86. Duncan, H.; Dick, T. Collaborative Workshops and Student Academic Performance in Introductory College Mathematics Courses: A Study of a Treisman Model Math Excel Program. Sch, Sci. Math. 2010, 100 (7), 365–373. 87. Gandara, P. Priming the Pump: Strategies for Increasing the Achievement of Underrepresented Minority Undergraduates; The College Board: New York, 1999. 88. Gosser, D.; Roth, V.; Gafney, L.; Kampmeier, J.; Strozak, V.; Varma-Nelson, P.; Radel, S.; Weiner, M. Workshop Chemistry: Overcoming the Barriers to Student Success. Chem. Educ. 1996, 1 (1), 1–17. 89. Gosser, D. K.; Roth, V. The Workshop Chemistry Project: Peer-Led Team Learning. J. Chem. Educ. 1998, 72 (2), 185–187. 90. Vygotsky, L. S., Mind in Society: The Development of Higher Psychological Processes; Harvard University Press: Cambridge, MA, 1978. 91. Deslauriers, L.; Schelew, E.; Wieman, C. Improved Learning in a Large-Enrollment Physics Class. Science 2011, 332 (6031), 862–824. 92. Mervis, J. Transformation Is Possible If a University Really Cares. Science 2013, 340 (6130), 292–296. 93. Plant, E. A.; Ericsson, K. A.; Hill, L.; Asberg, K. Why Study Time Does Not Predict Grade Point Average across College Students: Implications of Deliberate Practice for Academic Performance. Contemp. Educ. Psychol. 2005, 30 (1), 96–116. 94. Boeckenstedt, J., “What’s all the fuss about?”. Higher Ed Data Stories. 
https://highereddatastories.blogspot.com/2013/09/whats-all-fuss-about.html (accessed February 28, 2019). 95. Integrated Postsecondary Education Data System. https://nces.ed.gov/ipeds/ (accessed February 28, 2019). 96. Tro, N. J. Chemistry: A Molecular Approach, 4th ed.; Pearson: New York, 2016. 97. Piazza. https://piazza.com/ (accessed February 28, 2019). 98. McGuire, S. Y.; McGuire, S. Teach Students How To Learn: Strategies You Can Incorporate into Any Course To Improve Student Metacognition, Study Skills, and Motivation; Stylus Publishing: Sterling, VA, 2015. 99. Dweck, C. S. Self-Theories: Their Role in Motivation, Personality, and Development (Essays in Social Psychology); Psychology Press: New York, 2000. 100. Duckworth, A. L.; Quinn, P. D. Development and Validation of the Short Grit Scale (Grit-S). J. Person. Assess. 2009, 91, 166–174. 101. Bauer, C. F. Beyond “Student Attitudes”: Chemistry Self-Concept Inventory for Assessment of the Affective Component of Student Learning. J. Chem. Educ. 2005, 82 (12), 1864–1870. 46

102. Treaster, J. B. “Will You Graduate? Ask Big Data”. New York Times, February 2, 2017, 2017. 103. Credé, M.; Tynan, M. C.; Harms, P. D. Much Ado about Grit: A Meta-Analytic Synthesis of the Grit Literature. J. Person. Soc. Psychol. 2017, 113 (3), 492–511. 104. Paas, F.; Renkl, A.; Sweller, J. Congitive Load Theory and Instructional Design: Recent Developments. Educ. Psychol. 2003, 38, 1–8.


Chapter 3

Implementing Metacognitive Writing in a Large Enrollment Gateway Chemistry Class

Uma Swamy*,1 and Jennifer Bartman2

1Department of Chemistry and Biochemistry, Florida International University, Miami, Florida 33199, United States
2Center for Advancement of Teaching, Florida State University, Tallahassee, Florida 32306, United States

*E-mail: [email protected].

Multiple-choice exams are an essential part of testing in high-enrollment gateway General Chemistry courses. Student performance on these exams has a considerable impact on student engagement, morale, motivation, persistence, and even progress to graduation. A majority of freshmen (and sophomores) are unprepared or underprepared for college and especially lack effective study strategies, the ability to self-regulate, and efficient test-taking skills. The need to create self-regulated learners resonates beyond chemistry and is relevant across disciplines, because students are more likely to succeed if they can assess their preparation, monitor their learning, evaluate their exam performances, and make the requisite adjustments on their own without instructor input. While a majority of students check their exam grade (and which questions they got right or wrong), only a few examine their performance closely, trying to learn from their mistakes. Even then, they focus on content skills rather than soft skills like study habits and test taking. Many faculty use exam wrappers to encourage students to reflect on their exam performance after the exam. In this report, students are also introduced to metacognitive strategies during the semester. They reflect on their learning and study strategies at various points in the semester. These instructor-designed prompts ask students to discuss how they are preparing for an exam and what score they expect, in addition to evaluating and dissecting their exam performance.

At many large universities, general chemistry is a high-enrollment course that most students take in their freshman year because the course is required for many STEM (Science, Technology, Engineering and Mathematics) degree programs. Student engagement during the first year of college is a determining factor in the rate of persistence (1). Students who do not successfully complete this course are unlikely to stay in the STEM field (2), and some drop out of the university altogether. Others choose to retake the class, in turn affecting class availability and time to graduation.

Florida International University (FIU) is a Hispanic-Serving Institution with a current student population that is 61% Hispanic and 13% African American. A large proportion of the students are commuters who travel to campus several days a week. Most students attending FIU would fall into the non-traditional category as defined by Choy (3):

- They are more likely to work at least part time, many working full time, to pay for college.
- They are considered financially independent for the purposes of determining eligibility for financial aid.
- They have to balance work and family responsibilities that compete with school work.

Many students take a heavy courseload each semester because they have unrealistic expectations about how much time they need to devote to each class. Students are often unprepared for the demands of a college course load. In addition, students may also be underprepared to take college-level courses. Students' interpretation of the course objectives and understanding of the course content may be very different from what the instructor envisions (1). Furthermore, they depend on the points breakdown in the syllabus to identify study strategies because they are idealistic about the connection between learning and earning grades. Students seem to think that learning results primarily from investing time and effort in graded assignments and activities. Even the most academically accomplished and self-confident students believe that grades are central to their learning process, grades being their motivation to do well and their benchmark for accomplishment (1). Students tend to think that success in each course depends on their understanding of what the instructor plans to assess and the exact assessment procedure. Therefore, students use the grading system for each course as a guide to gauge where to focus their attention and which graded assignments to complete in order to earn a good grade in the course (1).

Freeman et al. (4) showed that active learning pedagogies increase exam performance, performance on concept inventories, and course passing rates. An increasing number of gateway STEM courses employ active learning strategies and other evidence-based teaching practices that are more student-centered. At Florida International University, the entire General Chemistry sequence is taught in a flipped format in conjunction with active learning strategies. Students work on activities in class facilitated by trained undergraduate Learning Assistants (LAs).
However, a majority of students are unprepared or underprepared for the demands of a curriculum based on active learning, because it involves a significant investment of time on the student's part outside of class (online homework, reading assignments, etc.). We have observed some students being disappointed with the absence of traditional lecture; they also find the idea of participating in class, and the emphasis on critical thinking and problem-solving skills, challenging. Students have their own opinions and beliefs (often based on prior experience) about what constitutes learning, what kinds of activities are relevant to learning the course material, and how they personally learn best (1).

In the traditional classroom, there is very little opportunity for students to engage in contextual control and regulation because most aspects of the tasks and content are controlled by the instructor (5). In more student-centered classrooms, students are offered more autonomy and responsibility and are expected to control and regulate academic tasks as well as classroom climate and structure. Since much of the learning takes place outside the classroom, students must be able to control, monitor and regulate their study environment for distractions and make it more conducive to studying. In class, students are expected to work together in collaborative and cooperative groups with peers to solve problems, design projects and experiments, develop classroom norms for discourse and thinking, and sometimes even collaborate with the instructor to determine how they will be evaluated, all of which provide multiple opportunities for contextual control and regulation (5).

Metacognition and Self-Regulation

According to Rickey and Stacy (6), "metacognition" is described as "thinking about one's thinking". In their review, Zohar and Barzilai (7) identified two main components of metacognition:

- Metacognitive knowledge refers to knowledge, beliefs, ideas and theories about cognition and their diverse interactions with cognitive tasks and strategies.
- Metacognitive skillfulness refers to the skills and processes used to guide, monitor, control and regulate cognition and learning. Self-regulation, planning and evaluation are other skills in this category.

Metacognitive ability can allow students to think more strategically and take responsibility for their own learning. Students are more effective learners when they have an arsenal of learning strategies, a willingness to apply them, and an awareness of the contexts in which these strategies will be effective. They can use these to plan, monitor and manage their own learning (8, 9). A meta-analysis of learning skills interventions revealed that training in learning skills was most effective when it promoted a high degree of metacognitive awareness (10). Successful learners are self-regulated and employ a number of metacognitive processes during learning. They elaborate on their existing knowledge, create conceptual relationships and make connections among items, develop self-explanations, and monitor their own understanding and comprehension (8, 11).

Uzuntiryaki-Kondakci and Capa-Aydin (12) found a significant relationship between metacognitive self-regulation and chemistry self-efficacy (both for cognitive skills and everyday applications). They also found that metacognitive self-regulation plays a key role in critical thinking. Students who take responsibility for their own learning and use self-regulatory strategies tend to be higher in efficacy, believing in their capabilities to explain fundamental chemistry concepts and connect chemistry with everyday life. Cooper and Sandi-Urena (13) developed and validated the Metacognitive Awareness Inventory (MCAI), an instrument designed specifically to assess metacognitive skillfulness during chemistry problem solving. They suggest using the instrument to identify students who are low on the metacognition scale and those who might be overestimating or underestimating their problem-solving abilities.
Students who display poor metacognitive performance on actual tasks but believe that they are highly metacognitive may be more resistant to interventions than students who are aware of their limited skills.

According to Pintrich (5), there are four general assumptions that most self-regulated learning models share:

- First is the active, constructive assumption: learners are viewed as active participants in the learning process who construct their own meanings, goals and strategies from information available in the external environment as well as internally in their own minds.
- Second is the potential-for-control assumption: learners can potentially monitor, control and regulate certain aspects of their own cognition, motivation and behavior, as well as some features of their environments.
- Third is the goal, criterion, or standard assumption: individuals can set standards or goals to strive for in their learning, monitor their progress towards these goals, and then adapt and regulate their cognition, motivation and behavior to reach them. Individuals compare themselves against a standard to assess whether the learning process should continue as is or whether a change is necessary, and what that change will be.
- The fourth assumption is that self-regulatory activities (cognition, motivation and behavior) mediate between personal and contextual characteristics and actual achievement and performance.

Pazicni and Bauer (14) demonstrated that the Dunning-Kruger phenomenon applies to introductory chemistry: students who performed poorly on exams significantly overestimated what they knew and, consequently, their performance on exams. Students who performed well underestimated their performance, but to a lesser degree. They further found that students who harbored illusions of competence on their first exam tended to do so throughout the semester, and the mismatch between their actual and perceived performance failed to lead these students towards self-insight. These students demonstrated a lack of metacognitive awareness about their understanding of chemistry and especially about how to learn it. However, there is some evidence that metacognitive training might help such students adopt metacognitive strategies, which in turn can help adjust their perception of competence (15, 16). Additionally, reviewing past performance has been shown to lead to better self-assessment (17).

Dang et al. (18) reported that assignments designed to promote metacognition can have an impact within a single semester and may actually provide the greatest benefit to lower-performing students. In a two-semester study, the researchers noticed the same pattern as Pazicni and Bauer (14), with more students over-predicting scores than under-predicting them, and a significantly lower mean for the over-predictors on Exam 1. By Exam 3, under-predictors still scored significantly higher on the exam, but they were outnumbered by the over-predictors. The problem arises because students are ill-equipped to perform such self-assessment.

Ye et al. (19) used text messages to have students self-report study habits in General Chemistry, and, as expected, students who studied did better than those who did not. Study habits are related to academic performance, especially for students who study beyond the mandated course requirements.
Students who perceived completion of mandatory online homework alone, along with reviewing notes and practice exams, as satisfactory exam preparation did not perform as well as students who believed that additional preparation was necessary, including reading the textbook and practicing problems. The researchers pondered whether differences in academic performance may be due to differences in motivation to succeed in the course, which in turn can influence study habits such as frequency and depth. The study showed that student study habits can change over the course of the semester, possibly because students change study techniques and choose resources based on effectiveness, time constraints, exam content, and the quality and quantity of available resources, as evidenced by a reduction in studying homework in preparation for exams as the semester progressed.

A student's poor performance on the first mid-semester exam can expose problems ranging from inadequate study time to putting in the time but using unproductive study methods (2). This may dissuade students from persisting in the course or in STEM fields. Thomas (20, 21) noted that students must undertake conscious reflection regarding the efficacy of the learning process and the activities and strategies they employ, or are directed to employ, in order to develop and enhance metacognitive capabilities. In the classroom setting, learning involves students developing a plan for learning content, monitoring their learning process through reflection, and adjusting their plan accordingly. Introducing students to metacognitive learning strategies gives them the opportunity to self-regulate, and this is especially important for those students who arrive lacking time management and learning skills (22).

Introducing Students to Metacognition and Self-Regulation

After the first exam, a majority of students check their grade (and consequently which questions they got right or wrong); however, only a dedicated few examine their performance closely, trying to learn from their mistakes. Students focus on content skills rather than soft skills like study habits and test-taking strategies. Their attention and focus is on "what to learn" and not on "how to learn." Very few students visit the faculty member's office to discuss their exam and have a conversation about how to improve their performance on the next exam. A majority of students fail to recognize the benefits of exams as a source of instructor feedback, and therefore many of them do not pick up their graded exams.

Many faculty use exam wrappers (also called cognitive wrappers), structured activities designed to encourage students to self-reflect on their exam performance and examine their study strategies and learning processes, with a focus on adjusting them to improve performance on the next exam (23). Additional challenges for faculty creating and using exam wrappers are that these activities must impose minimally on instructional class time and on students' time outside class, be easily adaptable across diverse courses, give students repeated practice while still offering variety, be flexible, be grounded in disciplinary learning, and target the metacognitive skills that the instructor wants students to learn (24). Achacoso (25) created the first reflective post-exam questionnaire and reported that when students completed it, they increased their metacognitive skills, along with an overall increase in mean scores and an improved ability to monitor and adjust learning strategies (22, 25, 26).
The term "exam wrapper" was coined by Lovett (24), who suggested that exam wrappers generally ask students three kinds of questions: how they prepared for the exam, what kinds of mistakes they made on the exam, and what they might do differently to prepare for the next exam. Metacognition is built into the assignment, which also promotes a view of exams as part of the learning cycle: reflect, compare, adjust (24). It is particularly beneficial to students transitioning to college because they are still working at becoming self-regulated learners. Gezer-Templeton et al. (23) reported that while students demonstrated the ability to create and implement goals to improve study strategies throughout the semester, students with a B exam average benefited most from this type of intervention, and most students reported that they planned to continue the practice in future courses because they believed it helped them improve their performance and study skills. However, in some other studies exam wrappers did not produce an increase in exam scores (26, 27). Soicher et al. (26) speculated that this could be because students did not recognize the value of metacognitive skills or the benefit of the activity itself, since it was only being used in one course. They suggested that to be more effective, exam wrappers should be used in more than one course in the same semester, as proposed by Lovett (24).

It is very important to inform learners about the benefits of metacognition, explain when it is relevant to use metacognition, and teach them how to think metacognitively (7). Students' motivation to apply learning strategies will depend on their understanding of the possible benefits, their knowledge of how to learn, and their understanding of how to apply learning strategies (28, 29). Explicit instruction is especially important for lower-achieving students. Cues and prompts may remind learners to activate metacognition but cannot compensate for deficiencies in metacognitive knowledge or metacognitive skills, so learners with low metacognitive abilities need clear instructions on why, what, when and how to use metacognition. Embedding instructions in the content matter in ways that enable learners to practice metacognition in class will prove helpful in this regard (7). A metacognition "buy in" talk after the first exam can serve this purpose, allowing students to see how this is supposed to help them. If students believe that this will be helpful to their learning, they will participate and also urge their peers to do so. Saundra McGuire's book Teach Students How to Learn (30) offers some great strategies. A "buy in" talk on the first day of the semester can also be helpful, but McGuire makes the point in her book that students will be more receptive after the first exam, and this has been the author's experience too.

Cook, Kennedy and McGuire (2) described several learning strategies that directly use metacognitive skills supported by cognitive science or learning support research:

- Paraphrasing or rewriting lecture notes ensures that, in the process of rewriting information in their own words, students access what they already know about the topic, actively construct meaning, make connections between new and existing information, and form cues for later retrieval.
- Working homework problems without using an example allows students to learn the underlying concept, develop independent problem-solving skills without websites or textbook answers, and perform confidently on exams.
- Previewing the content before it is covered in class primes the brain for learning, allowing students to be more engaged and retain more information.
- Group study allows students to evaluate each other's thinking and adjust their own mental models as well as those expressed by others. Hearing different opinions, points of view and thoughts about a topic increases a student's understanding, making them more likely to be metacognitive in how they approach problems.
- Teaching information (or pretending to teach it) to a live or imagined audience allows students to discover whether they know what they think they know, giving them an opportunity to work on gaps before the exam.

Cook, Kennedy and McGuire (2) implemented a one-day lecture for General Chemistry students in which they presented differences in expectations between high school and college courses, along with other metacognitive tools including a study cycle. Students who attended the treatment performed better on the measurable outcome than those who did not. These researchers concluded that students can be taught, in 50 minutes, learning strategies that help them succeed in general chemistry.

Self-regulated learning is an active and constructive process wherein students regulate different cognitive, motivational, volitional and behavioral processes during learning (31). Pintrich (5) presented a framework for classifying the different phases and areas of regulation. Phase 1 is forethought, planning and activation, which involves planning and goal setting, task perception, prior content knowledge activation, and metacognitive knowledge activation. Phase 2 is monitoring, which involves metacognitive awareness and monitoring of cognition. Phase 3 is control, which involves the selection and adaptation of cognitive strategies for learning and thinking. One of the central aspects of the control and regulation of cognition is the selection and adaptation of various cognitive strategies for memory, learning, reasoning, problem solving and thinking (examples include rehearsal, elaboration and various other organizational strategies that learners can use). Phase 4 is reaction and reflection.

An intervention in the form of a talk, a presentation or a whole-class discussion could serve as the first step, getting the ball rolling by exposing students to information about how to plan, monitor and regulate their cognition. This includes setting specific targets or cognitive goals for learning, activating any specific knowledge students might have about the task or themselves, and building awareness to monitor progress towards their goals, with respect to both learning and comprehension, so that they can make adaptive changes to their learning (5). This can be achieved using direct training, such as explicit metacognitive training in a classroom talk; indirect training, using metacognitive prompts that lead students to carry out specific instructions; or a combination of both (32). Direct training is necessary for students lacking metacognitive competence, in order to teach both metacognitive knowledge and skills thoroughly. Most general chemistry students would fall into this category. Merely knowing about metacognitive knowledge and skills will not enable these students to apply them spontaneously. Metacognitive prompts then serve as indirect training to stimulate students to apply these skills during learning. Bannert et al. (32) also outline three general principles for effective metacognitive instruction: first, integrate and embed metacognitive instruction into the domain-specific instruction (subject matter teaching); second, explain the application and usefulness of all instructional metacognitive strategies used, so that students apply them spontaneously later on; and finally, provide students enough training time to implement and automatize the metacognitive activities they have learned (32).

Using Writing to Promote Learning

Lin (33) described metacognition as the ability to monitor and understand one's own thoughts and the assumptions and implications of one's activities. Learning is enhanced when students engage in metacognitive activities like self-assessment, self-explanation, monitoring or revising. Weaker students benefit more than stronger students as they become more effective learners who are aware of their strengths and limitations and are able to find ways to mitigate those limitations. Students do not spontaneously engage in metacognitive thinking unless explicitly encouraged to do so through carefully designed instructional activities, and it is therefore important to include metacognitive support in the design of learning environments (33). Activities designed to prompt students to monitor their own learning should give us a window into how students learn and how they "think" they learn. Writing affords one of the most effective means for instructors to see a student's processes of thinking, self-reflection and self-monitoring.

According to Zohar and Barzilai (7), metacognitive writing can encompass journal writing, writing reports, or short reflections in which learners have opportunities to reflect on, describe and analyze their learning and thinking. They reported that using reflective writing to foster metacognition was the second most frequently recurring metacognitive instructional practice (the first being the use of metacognitive prompts during science instruction). They also defined metacognitive prompts as questions, cues or probes that are introduced in writing, by the teacher, by student peers, or in a computerized environment, with the aim of activating and fostering metacognitive thinking skills in the course of learning. Bannert and Reimann (31) defined instructional prompts and instructional prompting as procedures to induce and stimulate cognitive, metacognitive, motivational, volitional and/or cooperative activities during learning.
Instructional prompts, also referred to in the literature as "procedural prompts" or "reason justification prompts", are aimed at focusing a student's attention on specific aspects of the learning process and may stimulate the recall of concepts, procedures, tactics, and techniques during learning. Because the prompts support the recall and execution of knowledge and skills rather than teaching new information (31), they can even induce the use of cognitive and metacognitive learning strategies, as well as strategies of resource management, that differ from prototypical instructional approaches. Bannert and Riemann (31) define self-regulatory prompts as instructional procedures, embedded within a learning context, that ask students to carry out specific self-regulated learning activities in which they explicitly reflect on, monitor, and revise their learning process. These prompts assume that students already possess metacognitive knowledge and skills but do not recall them or are unable to execute them spontaneously (25, 31). Engaging students in more reflective and strategic behavior should positively influence the learning process, which in turn increases students' academic performance. However, merely offering tools and scaffolding prompts to improve self-regulated learning is not always sufficient; it is necessary to ensure that the prompts are utilized frequently and in the intended manner. It is also important that students are metacognitively competent before utilizing the prompts. The lack of metacognitive competency may be one reason why instruction in self-regulated learning has not always shown positive effects on learning outcomes and academic performance (31). Therefore, every attempt should be made to provide direct training (26) in metacognitive knowledge and skills to students before utilizing the prompts. According to Nuckles et al (34), a learning protocol is a written explanation of one's own learning processes and outcomes; when this writing happens over an extended period of time, it is referred to as a learning diary. The terms "learning diary", "learning journal", and "log book" are frequently used interchangeably in educational psychology. Their primary purpose is to promote reflection: either reflection on the learning content, also referred to as knowledge about a specific domain (e.g.
the content of a textbook, science, mathematics, reading, comprehension, writing skills, or problem solving), or reflection on the learning behavior itself, also referred to as knowledge about the self as a learner, which can be used to monitor student progress and assess learning (28, 33). Most research studies focus on only one of these aspects. Learning protocols combined with cognitive and metacognitive prompts have been shown to improve students' understanding of learning content as well as their learning strategies (28). When McCrindle and Christensen (35) provided a group of students with an opportunity to reflect on their learning process within the context of their biology course by writing learning journals, the students showed greater metacognitive awareness, an ability to implement metacognitive strategies effectively, and thereby greater control of their learning process, as compared to a control group that wrote a scientific report on the material they were learning. These students also used more active and transformative cognitive strategies while engaged in a learning task and outperformed the control group on the final examination for that course, despite having fewer opportunities to interact with the course content (28). McCrindle and Christensen stated that providing students with opportunities to deliberately reflect on their own learning and cognitive processes can meaningfully improve academic performance, because the reflection process may well exert a continued influence on the students' learning: the students developed a more sophisticated understanding of the concept of learning, its purpose, and the processes of learning, which in turn transformed their underlying views and beliefs about the nature of learning (35).
However, Fabriz et al (28) cautioned that the use of learning diaries is more effective when combined with an intervention on self-regulated learning, and in conjunction with activities that inform students about "how to learn", so that students can apply this knowledge when reflecting on their learning behavior in their writings. They also emphasized the importance of informing students about the benefits of keeping a learning diary, because students apply a learned strategy better when they know its benefits (28, 29). Though they were unable to find any effect of diary use on academic achievement, Fabriz et al (28) suggested that since the use of self-regulation strategies improved during the course of the semester, it may take some time for the positive effects of using these strategies to become visible: students need to internalize the application of the newly learned strategies before a noticeable effect on academic achievement can be seen. With time, students gain experience, and their use of strategies becomes more automated and sophisticated. Dignath and Büttner (29) believe that longer interventions will allow for intensive acquisition and practice of self-regulated learning strategies. Berthold et al (36) reported that learning journals generally led to enhanced learning outcomes, but argued that providing general, unspecific instructions for writing a learning protocol was not sufficient. These researchers defined prompts as questions or hints that induce productive learning processes and can encourage learners to apply more advanced cognitive and metacognitive strategies, which should eventually lead to more favorable learning outcomes and higher accuracy in self-assessment. They suggested thinking of these prompts as strategy activators, because they induce learning strategies that learners are, in principle, capable of but do not demonstrate spontaneously, or demonstrate only to an unsatisfactory degree. They reported that learners who received cognitive prompts or mixed prompts (cognitive and metacognitive) significantly outperformed the group with no prompts, and that all participants who received prompts of any kind (cognitive, metacognitive, or mixed) showed a higher degree of metacognitive learning strategies in their learning protocols. Therefore, they suggested that providing cognitive and metacognitive prompts is a very effective method to foster cognitive and metacognitive learning strategies in writing learning protocols.
The researchers also reported that the elicitation of cognitive learning strategies, as well as the mixed condition of cognitive and metacognitive strategies, strongly fostered learning outcomes, whereas metacognitive strategies alone were not helpful in improving learning outcomes. So that readers can clearly see the differences among cognitive, metacognitive, and mixed strategies/prompts, examples (36) are shown in Table 1.

Table 1. Types of Prompts

Cognitive prompts
  Example: How can you best organize the structure of the learning content? (prompt to elicit organizational strategies)
  Example: Which aspects of the learning material do you find interesting, useful, or convincing, and which not? (prompt to elicit elaboration strategies)

Metacognitive prompts
  Example: Which questions, in my opinion, were not sufficiently clarified by the lecture video? (monitoring and self-diagnosis prompt)
  Example: What possibilities do I have to overcome my comprehension problems? (self-regulation prompt)

Mixed prompts (cognitive and metacognitive prompts together)
  Example: What are the main points in your opinion? (cognitive, organizational prompt)
  Example: Which main points have I already understood well? Which main points haven't I understood yet? (metacognitive, monitoring and self-diagnosis prompt)

Berthold et al (36) explained that cognitive learning processes like organization and elaboration strategies enable deep comprehension and retention of the learning contents, whereas metacognition refers to the knowledge and awareness of one's own cognitive processes and the ability to actively control and manage those processes. If learners show productive metacognitive processing, the elicitation of metacognitive strategies during the production of a learning protocol can help prevent illusions of understanding and trigger remedial cognitive activities (30). Cognitive and metacognitive strategies are not independent of each other but complementary; these researchers therefore proposed that metacognitive learning strategies must be performed together with cognitive learning strategies in order to foster learning. For example, monitoring one's understanding is especially helpful when subsequent remedial cognitive strategies close the detected gaps in one's knowledge. Berthold et al (36) also recommended that, in order to increase the likelihood of student engagement and continued application, learners be informed in detail about which learning strategies are effective and why. They termed such a procedure informed prompting: in addition to the prompts themselves, a short text providing background information about the learning strategies to be elicited (possibly with empirical evidence) is given to the learners. Making learners cognizant of the advantages of certain strategies strengthens their awareness of the benefits of these strategies, and this "informed training" in turn leads to successful maintenance of the acquired learning strategies (36). Peters et al (37) reported a significantly higher gain in content knowledge and nature of science knowledge for an experimental group receiving metacognitive prompts than for the control group.
The experimental group gained a more sophisticated understanding of the science content, in addition to an understanding of the nature of science. Peters et al (37) attributed the increase in learning outcomes in their study to two factors: first, the design of the science inquiry activity, in which the prompts were embedded into the content itself; and second, following the work of Berthold et al (36), the use of a combination of cognitive and metacognitive prompts (mixed prompts) in the experimental group, while the control group was given only cognitive prompts. Students in the experimental group reported that they no longer studied by rote memorization, because they recognized the interconnectedness of the material after completing the activities (37). In a meta-analysis of the efficacy of 48 school-based Writing to Learn (WTL) programs, Bangert-Drowns et al (38) showed that writing can have a small positive impact on academic achievement and that two factors predicted enhanced effects: the use of metacognitive prompts and increased treatment length. The authors also found that WTL interventions that included prompts requiring students to reflect on their current understandings, confusions, feelings about the subject matter, and learning processes proved particularly effective in improving performance in STEM courses. Furthermore, they suggested that these writing interventions did not need to be elaborate. Ramirez et al (39) conducted a study in which students with math anxiety wrote about their testing worries before a math exam while a control group wrote about something else. They found that the experimental group significantly improved their exam scores, especially the students who suffered from test anxiety. Park et al (40) reported that expressive writing reduced the performance gap between high math anxious individuals (HMA) and low math anxious individuals (LMA) as compared to a control group.
Simply asking students to write about their thoughts and feelings before a high-stakes math test helps reduce the impact that math anxiety has on student performance and has the potential to help students who struggle with math anxiety demonstrate their true competency during high-stakes mathematics examinations.

The First Set of Prompts—Starting the Semester on the Right Note

The first metacognitive writing prompts are assigned during the first week of the semester to get students to think about their feelings toward the subject of chemistry and their confidence in their ability to succeed in the course. Fear of failure and of an uncertain outcome is an impediment to engagement and commitment to the course, as many students are afraid of chemistry. Our hope is that if students reflect on this fear and write about it, it will help them overcome the fear and allow them to be more engaged in the subject. It also gives us a sense of their attitude toward, and preparation for, the subject. Cox (1) points out that it becomes the instructor's responsibility to understand how students perceive the content and, if necessary, help revise student perceptions in a way that harmonizes them with the instructor's vision of learning. The students had to answer three questions by the end of the first week:

1. How do you feel about chemistry as a subject and why? Please explain your answer.
2. How prepared do you feel to take general chemistry? Please explain your answer.
3. Describe the strategies that have helped you succeed in chemistry (or other science courses) in the past. Please explain your answer.

Traditionally, students have been hindered by tremendous fear and anxiety about chemistry and doubts about their ability to succeed in the course. In the author's courses, students have been very open about their feelings and will write things in this reflection that they would never confess to anyone. Reading these reflections allows the instructor to understand the fears and anxieties of the students, and sometimes the sources of those anxieties and fears. These prompts also give the instructor the ability to openly discuss and allay these fears and anxieties.
An open discussion in class summarizing the results allows students to realize that they are not alone in feeling this way and that many of their classmates are in the same place. This is a great opportunity to encourage the students to work with each other and help each other succeed. Instructors can also revisit the expectations discussed on the first day of class and persuade students that they have the ability to accomplish the work, fostering the perception that the coursework is challenging but “doable”. Validation is powerful and sets the right tone for the rest of the semester.

The Second Set of Prompts—Reflection on Preparation for the First Midterm

The second set of metacognitive prompts is assigned just before the first midterm examination and is meant to help students reflect on their preparation for this exam and predict their performance by answering the following questions:

1. How are you preparing for the exam and everything you expect to see on the exam? Please include as much detail as possible: resources used, time spent studying, how you studied, etc.
2. What do you expect to see on the exam? (This deals with kinds of questions, content, and anything else that you think is applicable in this regard.)
3. How confident do you feel about your performance on the upcoming exam? Please explain your answer in as much detail as possible.
4. Predict your score on Exam 1 and explain why you think you will receive that score.

The answers to these questions enable an instructor to assess student expectations and to tailor the post-exam discussion to address what the students expected to see on their exams, what resources they were primarily using, how much time they spent preparing for the exam, and what this time was dedicated towards.

The Third Set of Prompts—Evaluating Performance after the First Midterm

We designed the rubric shown in Figure 1 as a metacognitive activity to help students analyze the exam. It is intended to get students to evaluate which resource each question on the exam came from, whether they got that question right and why or why not, and which level of Bloom's taxonomy each question would be sorted into. Bloom's taxonomy is a simple tool that allows college freshmen to recognize that most of the questions on the exam are not based on memorization but instead on analysis and problem solving. After completing this process for all the exam questions, students answer two questions at the end of the worksheet, which constitute the third set of metacognitive prompts:

1. What other sources do you think some of the exam questions might have come from that were not covered in this assignment?
2. What did you learn about studying for the next exam?

Students rely too heavily on old exams, or "practice exams" as students refer to them, and fail to use the multitude of other resources that a faculty member may make available to them to prepare for the examination. This rubric allows students to recognize the availability of these resources, understand how their instructor uses these resources to create exam questions, and see what kinds of questions are on the exam (analysis vs. memorization). They will also, hopefully, begin thinking about how to use the available resources to prepare for these kinds of questions in the future. In answering the second question, the student puts all their newly learned skills to good use, combining how they prepared for the last exam, their score on the exam, and all the new information they received after the exam to formulate a plan for either improving or maintaining their grade in the course.

The Fourth Set of Prompts—Starting to Reflect on Their Performance after the First Midterm

In the fourth set of metacognitive prompts, the students reflect on what they learned after completing the rubric in Figure 1. The rubric facilitates metacognition by allowing them to identify the origin of the questions on the exam and how best to utilize these same resources while preparing for the next exam.

1. What did you learn from comparing the questions on the exam to your course workbook/textbook/other worksheets? Please explain in detail.
2. Are you planning to change the way you study based on what you learned from completing the rubric? Please explain your answer in detail.
3. What kind of questions did you have trouble with: conceptual questions, numerical questions, or both? With what you have learned, how could you go about trying to study for this type of question in the future?

Many students are surprised to learn that some questions on the exam are derived from the problems at the back of the chapter, or that we slightly altered a problem from their workbook, textbook, or homework to include on the exam. Once students realize this, it gives them an opportunity to review and adjust (or fine-tune) their study strategies and come up with a plan for how they will use the available resources to prepare for the next exam. The third question specifically asks the student to evaluate whether they have issues with numerical questions (possibly pointing to difficulties with mathematical calculations) or conceptual questions (possibly pointing to reading comprehension or incomplete mental models for a particular topic). This allows them to focus their energies.

Figure 1. Take-home assignment that allows the students to metacognitively assess the origin of each question on the exam, whether they had seen a similar question in one of their available resources, and their response to that question.

The Fifth Set of Prompts—Reflecting on Their Performance as a Whole after the First Midterm

In the fifth set of metacognitive prompts, the students reflect on their performance on Exam 1.

1. How did the grade you earned compare with the grade you expected to earn? Was the grade you earned higher or lower than what you expected? By how much? Were you satisfied with your grade? Why or why not?
2. Do you think this test accurately measured your learning so far in the course? Why or why not? What (if anything) did the test cover that you did not expect? What (if anything) did you learn that was not on the test? Please be general and give me specifics as well. (This helps me see how you allocate your time and also helps me understand your thought process.)
3. What kind of questions did you have trouble with: conceptual questions, numerical questions, or both? Why was that particular type of question challenging to you? How could you go about trying to study for this type of question in the future?
4. You were asked to evaluate Exam 1 based on the principles of Bloom's taxonomy. Did it change your perspective on how to study for future exams? Please explain how.

Most students take this opportunity to be insightful about what happened and show evidence of deep metacognitive thinking aimed at improving or maintaining their performance on the next exam. This is also where the instructor gets to see whether students exhibit the characteristics that Pazicni and Bauer (14) demonstrated and to assess whether students are taking responsibility for their performance on the exam. The instructor also gets to evaluate whether students are exhibiting metacognition, reflecting on where their weaknesses were, and whether they have a plan to improve on these limitations. In her book Teach Students How to Learn (30), Saundra McGuire explains how getting students to code exam questions using Bloom's taxonomy helps them understand that chemistry exams do not just evaluate students on plain memorization but in fact test their analytical and problem-solving capabilities. This is a very simple tool for college freshmen to use, and it allows the instructor to encourage students to develop study skills that move well beyond rote memorization.

The Sixth Set of Prompts—Reflection on Preparation for the Second Midterm Using the Experiences from the First Midterm

By the time the second midterm examination rolls around, students have hopefully had enough exposure to metacognitive prompts and strategies that they can begin to apply what they learned toward their preparation for the second midterm. The sixth set of metacognitive prompts is assigned the week before Exam 2 and repeats some of the prompts from the second set (since both are assigned before exams). However, students are still guided toward self-regulation and monitoring of their study skills.

1. How do you predict you will perform on Exam 2? Explain why you feel this way.
2. Is there a difference in how you're preparing for Exam 2 compared to how you prepared for Exam 1? If so, what did you change about your approach and why? If you did not change anything, why is that? Please explain how your approach is effective.
3. Based on your experience in studying for and taking Exam 1, what did you learn that will help you prepare for Exam 2? What advice would you give someone who comes to you seeking help?

The second question is written in a manner that compels the student to revisit their preparation for, and performance on, the first midterm examination. In addition, it encourages them to describe what they changed and explain why they changed something or did not change anything. By asking them to defend their approach, we are compelling them to take a closer look at their study strategies and reassess their effectiveness. Students have access to their old reflections throughout the semester, and we gently nudge them to go back and review those, if they have forgotten, and to commit to a plan of improvement and implement it. In the third question, asking a student to give someone else advice is an effective strategy to get them to self-reflect about what they need to do to improve their performance on the exam.
Students have mentioned that when they type out the answer giving advice to someone else, they read it and think to themselves that this is great advice and that they should follow it too.

Instructor Notes and Final Thoughts

Many researchers (7, 36, 28, 29) have pointed out that though cues and prompts may trigger learners to initiate metacognition, students with low metacognitive abilities need clear instructions on how to use metacognition in a given situation. The questions therefore need to be clear and specific with respect to the objective. Providing prolonged practice and training helps ensure that the metacognitive activities are viewed favorably by students and enables smooth implementation of these activities (7). Therefore, we created multiple sets of prompts for the students to complete over several weeks. Students are also exposed to the same prompts in General Chemistry 1 and then again in General Chemistry 2, allowing them to become comfortable with the idea of "writing to learn about learning" itself. Integrating metacognitive assignments into the fabric of course organization and making them part of the grading scheme ensures that students get the message that these activities are important. Even if the activities are worth a very small portion of the overall grade in the class (less than 0.5%), this is still helpful because it signals to the students that the work matters to the instructor. The very first semester, we made the mistake of presenting them as extra-credit assignments, and it was mostly high-performing students who did the assignments, with barely any low-performing students attempting them. In subsequent semesters, even with the assignments being worth a very small portion of the overall grade, we have had a significantly higher proportion of the low performers completing the assignments and benefitting from them. However, messaging is crucial, and transparency is key for students to understand the instructor's intent in assigning these prompts and what the students can hope to gain from them. The students must be assured that the reflections are private and that only the instructor will have access to them. The students also need to understand that this is not just "busy work" and that there are substantial benefits to engaging in this behavior.
Encouraging students to be metacognitive during every class meeting, and frequently modeling how to self-regulate and evaluate where they stand with their understanding, is very helpful. Embedding instructions in the content matter in ways that enable learners to practice metacognition in class will prove helpful in this regard. For example, during problem-solving activities in class, the instructor could have students evaluate all the possible sources of error in a problem and how to mitigate them. Another possibility is to have the students write a "one-minute paper" (41) describing something they learned by doing the problem, an issue related to the problem or concept that they know they need to work on, and what steps they plan to take to fix this issue. The students could take a picture of this on their phones before handing the assignment to the instructor. The responses allow the instructor to assess where the students stand on self-regulation and how seriously they are taking it. They also allow the instructor to see where students are having problems and plan the next class accordingly. In the author's sections of general chemistry, metacognitive writing assignments are given as homework before and after every exam, so students know to expect an assignment before and after each exam. All the metacognitive activities are assigned as quizzes on the learning management system, with an essay answer allowed for each question. After the due date, the instructor goes through the responses, primarily grading them for completion. If a particular reflection needs further follow-up, the instructor reaches out to that student via email. The instructor finds it useful to summarize some of the common themes that emerge from the reflections, allowing students to see that some of their classmates are having the same experiences and creating a sense of camaraderie.
Presenting the metacognition and self-regulation "buy-in" talk immediately after the grades for the first midterm exam are released has been found to ensure student engagement, especially if the instructor's message is that the next few exams will be more challenging. Following this up by assigning the rubric on the same day has had a synergistic effect on students' motivation to work toward a better grade on the next exam. The metacognitive rubric from the first exam (Figure 1) remains on the course learning management system (LMS), where the students can continue to access it. Students can also access all their previous reflections throughout the semester (but cannot make changes to them). Students are not required to report their use of the metacognitive activities for the remaining exams, but they are encouraged to go through the same process that they did for Exam 1, using the previous instructor prompts to reflect on their performance on Exam 2, their preparation for Exam 3, and later their performance on Exam 3 in preparation for the cumulative final exam. As far as grading the rubric goes, the Learning Assistants (LAs) work together to determine the workbook and textbook pages and create a key. They then go through the rubrics for the students assigned to them and check for completion. They pick a few questions at random and check whether the page numbers for the textbook and the workbook agree with the key.

Conclusion

The need to create self-regulated learners with metacognitive knowledge and skills resonates beyond chemistry and is relevant across all disciplines. To facilitate this, instructors need to build metacognitive activities into their courses and encourage students to engage in them routinely until metacognition becomes second nature. Unfortunately, in large classes it is very difficult to give individualized feedback and guidance to each student, but this strategy can be an effective technique to train students to assess their preparation, predict their grades, evaluate their exam performances, and make the requisite adjustments on their own with limited instructor input. It is not necessary to use all the prompts together or in the same sequence, and they can be altered to fit the unique student body of a particular course or institution. Hopefully, this will allow our students to develop and utilize their metacognitive abilities to succeed in their gateway courses, stay in STEM or STEM-related fields, remain on track to complete their degrees in a timely manner, and succeed in their chosen careers.

References 1. 2. 3. 4.

5.

6. 7.

Cox, R. D. The College Fear Factor: How Students and Professors Misunderstand One Another; Harvard University Press: Cambridge, MA, 2009. Cook, E.; Kennedy, E.; McGuire, S. Y. Effect of Teaching Metacognitive Learning Strategies on Performance in General Chemistry Courses. J. Chem. Educ. 2013, 90 (8), 961–967. Choy, S. P. Nontraditional Undergraduates, The Condition of Education. 2002, https://eric.ed. gov/?id=ED471742. Freeman, S.; Eddy, S. L.; McDonough, M.; Smith, M. K.; Okoroafor, N.; Jordt, H.; Wenderoth, M. P. Active Learning Increases Student Performance in Science, Engineering, and Mathematics. Proc. Natl. Acad. Sci. 2014, 111 (23), 8410–8415. https://doi.org/10.1073/ pnas.1319030111. Pintrich, P. R. A Conceptual Framework for Assessing Motivation and Self-Regulated Learning in College Students. Educ. Psychol. Rev. 2004, 16 (4), 385–407. https://doi.org/10.1007/ s10648-004-0006-x. Rickey, D.; Stacy, A. M. The Role of Metacognition in Learning Chemistry. J. Chem. Educ. 2000, 77 (7), 915. https://doi.org/10.1021/ed077p915. Zohar, A.; Barzilai, S. A Review of Research on Metacognition in Science Education: Current and Future Directions. Stud. Sci. Educ. 2013, 49 (2), 121–169. https://doi.org/10.1080/ 03057267.2013.847261.


8. Nilson, L. B. Creating Self-Regulated Learners: Strategies to Strengthen Students’ Self-Awareness and Learning Skills; Stylus Publishing: Sterling, VA, 2013.
9. Brown, P. C.; Roediger, H. L., III; McDaniel, M. A. Make It Stick: The Science of Successful Learning, 1st ed.; Belknap Press: Cambridge, MA, 2014.
10. Hattie, J.; Biggs, J.; Purdie, N. Effects of Learning Skills Interventions on Student Learning: A Meta-Analysis. Rev. Educ. Res. 1996, 66 (2), 99–136.
11. Reynolds, J. A.; Thaiss, C.; Katkin, W.; Thompson, R. J. Writing-to-Learn in Undergraduate Science Education: A Community-Based, Conceptually Driven Approach. CBE—Life Sci. Educ. 2012, 11 (1), 17–25. https://doi.org/10.1187/cbe.11-08-0064.
12. Uzuntiryaki-Kondakçi, E.; Çapa-Aydin, Y. Predicting Critical Thinking Skills of University Students through Metacognitive Self-Regulation Skills and Chemistry Self-Efficacy. Educ. Sci. Theory Pract. 2013, 13 (1), 666–670.
13. Cooper, M. M.; Sandi-Urena, S. Design and Validation of an Instrument To Assess Metacognitive Skillfulness in Chemistry Problem Solving. J. Chem. Educ. 2009, 86 (2), 240. https://doi.org/10.1021/ed086p240.
14. Pazicni, S.; Bauer, C. F. Characterizing Illusions of Competence in Introductory Chemistry Students. Chem. Educ. Res. Pract. 2014, 15 (1), 24–34. https://doi.org/10.1039/C3RP00106G.
15. Kruger, J.; Dunning, D. Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments. J. Pers. Soc. Psychol. 1999, 77 (6), 1121–1134.
16. Kruger, J.; Dunning, D. Unskilled and Unaware—but Why? A Reply to Krueger and Mueller (2002). J. Pers. Soc. Psychol. 2002, 82 (2), 189–192. https://doi.org/10.1037/0022-3514.82.2.189.
17. Dunning, D.; Heath, C.; Suls, J. M. Flawed Self-Assessment: Implications for Health, Education, and the Workplace. Psychol. Sci. Public Interest 2004, 5 (3), 69–106. https://doi.org/10.1111/j.1529-1006.2004.00018.x.
18. Dang, N. V.; Chiang, J. C.; Brown, H. M.; McDonald, K. K. Curricular Activities That Promote Metacognitive Skills Impact Lower-Performing Students in an Introductory Biology Course. J. Microbiol. Biol. Educ. 2018, 19 (1), 1–9.
19. Ye, L.; Oueini, R.; Dickerson, A. P.; Lewis, S. E. Learning beyond the Classroom: Using Text Messages to Measure General Chemistry Students’ Study Habits. Chem. Educ. Res. Pract. 2015, 16 (4), 869–878. https://doi.org/10.1039/C5RP00100E.
20. Thomas, G. P. Student Restraints to Reform: Conceptual Change Issues in Enhancing Students’ Learning Processes. Res. Sci. Educ. 1999, 29 (1), 89. https://doi.org/10.1007/BF02461182.
21. Thomas, G. P. Metacognition in Science Education: Past, Present and Future Considerations. In Second International Handbook of Science Education; Fraser, B. J., Tobin, K., McRobbie, C. J., Eds.; Springer International Handbooks of Education; Springer Netherlands: Dordrecht, 2012; pp 131–144. https://doi.org/10.1007/978-1-4020-9041-7_11.
22. Zhao, N.; Wardeska, J.; McGuire, S.; Cook, E. Metacognition: An Effective Tool to Promote Success in College Science Learning. J. Coll. Sci. Teach. 2014, 43 (4), 48–54.


23. Gezer-Templeton, P. G.; Mayhew, E. J.; Korte, D. S.; Schmidt, S. J. Use of Exam Wrappers to Enhance Students’ Metacognitive Skills in a Large Introductory Food Science and Human Nutrition Course. J. Food Sci. Educ. 2017, 16 (1), 28–36. https://doi.org/10.1111/1541-4329.12103.
24. Lovett, M. C. Make Exams Worth More Than the Grade. In Using Reflection and Metacognition to Improve Student Learning; Kaplan, M., Silver, N., Lavaque-Manty, D., Meizlish, D., Eds.; Stylus Publishing; pp 18–52.
25. Achacoso, M. V. Post-Test Analysis: A Tool for Developing Students’ Metacognitive Awareness and Self-Regulation. New Dir. Teach. Learn. 2004, 2004 (100), 115–119. https://doi.org/10.1002/tl.179.
26. Soicher, R. N.; Gurung, R. A. R. Do Exam Wrappers Increase Metacognition and Performance? A Single Course Intervention. Psychol. Learn. Teach. 2017, 16 (1), 64–73. https://doi.org/10.1177/1475725716661872.
27. Chew, K. J.; Chen, H.; Rieken, B.; Turpin, A.; Sheppard, S. Improving Students’ Learning in Statics Skills: Using Homework and Exam Wrappers to Strengthen Self-Regulated Learning. In 2016 ASEE Annual Conference & Exposition Proceedings; ASEE Conferences: New Orleans, Louisiana, 2016; p 25633. https://doi.org/10.18260/p.25633.
28. Fabriz, S.; Dignath-van Ewijk, C.; Poarch, G.; Büttner, G. Fostering Self-Monitoring of University Students by Means of a Standardized Learning Journal—a Longitudinal Study with Process Analyses. Eur. J. Psychol. Educ. 2014, 29 (2), 239–255. https://doi.org/10.1007/s10212-013-0196-z.
29. Dignath, C.; Büttner, G. Components of Fostering Self-Regulated Learning among Students. A Meta-Analysis on Intervention Studies at Primary and Secondary School Level. Metacognition Learn. 2008, 3 (3), 231–264. https://doi.org/10.1007/s11409-008-9029-x.
30. McGuire, S. Y. Teach Students How to Learn: Strategies You Can Incorporate Into Any Course to Improve Student Metacognition, Study Skills, and Motivation; Stylus Publishing, 2015.
31. Bannert, M.; Reimann, P. Supporting Self-Regulated Hypermedia Learning through Prompts. Instr. Sci. 2012, 40 (1), 193–211. https://doi.org/10.1007/s11251-011-9167-4.
32. Bannert, M.; Hildebrand, M.; Mengelkamp, C. Effects of a Metacognitive Support Device in Learning Environments. Comput. Hum. Behav. 2009, 25 (4), 829–835. https://doi.org/10.1016/j.chb.2008.07.002.
33. Lin, X. Designing Metacognitive Activities. Educ. Technol. Res. Dev. 2001, 49 (2), 1042–1629.
34. Nückles, M.; Schwonke, R.; Berthold, K.; Renkl, A. The Use of Public Learning Diaries in Blended Learning. J. Educ. Media 2004, 29 (1), 49–66. https://doi.org/10.1080/1358165042000186271.
35. McCrindle, A. R.; Christensen, C. A. The Impact of Learning Journals on Metacognitive and Cognitive Processes and Learning Performance. Learn. Instr. 1995, 5 (2), 167–185. https://doi.org/10.1016/0959-4752(95)00010-Z.
36. Berthold, K.; Nückles, M.; Renkl, A. Do Learning Protocols Support Learning Strategies and Outcomes? The Role of Cognitive and Metacognitive Prompts. Learn. Instr. 2007, 17 (5), 564–577. https://doi.org/10.1016/j.learninstruc.2007.09.007.
37. Peters, E. E.; Kitsantas, A. Self‐regulation of Student Epistemic Thinking in Science: The Role of Metacognitive Prompts. Educ. Psychol. 2010, 30 (1), 27–52. https://doi.org/10.1080/01443410903353294.

38. Bangert-Drowns, R. L.; Hurley, M. M.; Wilkinson, B. The Effects of School-Based Writing-to-Learn Interventions on Academic Achievement: A Meta-Analysis. Rev. Educ. Res. 2004, 74 (1), 29–58. https://doi.org/10.3102/00346543074001029.
39. Ramirez, G.; Beilock, S. L. Writing About Testing Worries Boosts Exam Performance in the Classroom. Science 2011, 331 (6014), 211–213. https://doi.org/10.1126/science.1199427.
40. Park, D.; Ramirez, G.; Beilock, S. L. The Role of Expressive Writing in Math Anxiety. J. Exp. Psychol. Appl. 2014, 20 (2), 103–111. https://doi.org/10.1037/xap0000013.
41. Angelo, T. A.; Cross, K. P. Classroom Assessment Techniques: A Handbook for College Teachers, 2nd ed.; Jossey Bass Inc.: San Francisco, CA, 1993.


Chapter 4

Improving First-Semester General Chemistry Student Success Through Retrieval Practice

Saul R. Trevino,*,1 Elizabeth Trevino,2 and Mary Osterloh1

1Department of Chemistry, Houston Baptist University, 7502 Fondren Rd., Houston, Texas 77074, United States
2Department of Special Populations, Houston Baptist University, 7502 Fondren Rd., Houston, Texas 77074, United States
*E-mail: [email protected].

General Chemistry I students are often first-semester college students who have not yet developed independent learning skills. These students struggle to adapt to the rigors of post-secondary education, which frequently leads them to drop out of college. Extant research has shown the benefits of retrieval practice, the practice of self-testing, for long-term retention and consolidation of learning material. In this paper, an approach for teaching chemistry students about retrieval practice is presented. In this approach, students experience retrieval practice while learning essential chemistry knowledge, and they are then guided to adopt the strategy as independent learners in the chemistry course. Preliminary data suggest that this approach can positively impact student success.

Introduction
College student success is an important issue for both students and universities. Many students come to college lacking the learning skills that will help them be successful, and unfortunately, many of them end up dropping out of college as a result. Retrieval practice, the practice of self-testing, has been shown to have a strong positive effect on learning and long-term retention (1, 2). In this paper, an approach for helping students experience the benefits of retrieval practice is presented. The hope is that, by experiencing retrieval practice and its benefits, students will be more likely to adopt the strategy during the time they spend learning outside the classroom.

Method
The following approach is inspired by the method of Karpicke & Roediger (1), modified to fit into a 50-minute class session. Data were collected on a total of 102 college students in four separate sections of a first-semester General Chemistry I course at Houston Baptist University, an ethnically diverse private university of about 3500 total students (undergraduate and graduate). One professor collected data in three of the sections and another professor collected data in the fourth section. Very early in the semester, a 50-minute class period (the second class period of the semester) or one hour at the beginning of lab (the second week of lab) was devoted to the following retrieval practice experience. This intervention was performed in a General Chemistry course, but we feel it would be relevant for other chemistry courses as well. Students were given a list of 11 polyatomic ions and 4 unit analysis tools to study for 5 minutes (Figure 1).

© 2019 American Chemical Society

Figure 1. Sample of the study sheet containing 11 polyatomic ions and 4 unit analysis tools.


Students then took a test (Figure 2) on the 15 items for 5 minutes. The order of the 15 items on the test was scrambled compared to the order on the study sheet. The students exchanged papers with a partner and graded each other. As the students were grading, the professor pointed out examples of correct and incorrect answers to try to ensure accuracy of peer grading. For example, “SO4” is incorrect for sulfate because it lacks the charge of the ion.

Figure 2. Round 1 Test Sample.

After the grading period, students received their tests back. On their study sheets, students were asked to cross off the items they got right and to take 4 minutes to study only the items they got wrong on the first test. Students then did another round of testing/grading on all 15 items, in accordance with the experimental condition of Karpicke & Roediger (1) that yielded the best results for long-term retention (i.e., the condition where all items were repeatedly tested). The order of the 15 items on the second test was once again scrambled.

Students were given another study sheet and asked to cross off the items they got right on the Round 2 test. The students then studied only items they got wrong for the Round 3 test. Grading of the Round 3 test was done outside of class by the professor. One week later, the students were given an unexpected final exam as a measure of long-term retention. The students presumably hadn’t studied the items since the week before, and this experience was done early enough in the semester that none of the items were covered in class during the previous week. The professor graded the final exam and double-checked the grading of the Round 1 test.
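The round structure described above (restudy only the missed items, but retest all items every round) can be sketched as a small simulation. This sketch is the editor's illustration, not the authors' code; the `p_learn` probability and item names are hypothetical.

```python
import random

def run_rounds(items, n_rounds=3, p_learn=0.5, seed=0):
    """Simulate the restudy-missed / retest-all protocol:
    each round begins with a study phase over the flagged items,
    then ALL items are tested, then only missed items are flagged."""
    rng = random.Random(seed)
    known = set()
    to_study = set(items)  # Round 1: study everything
    scores = []
    for _ in range(n_rounds):
        # Study phase: each studied item has some chance of being learned.
        for item in sorted(to_study):
            if item not in known and rng.random() < p_learn:
                known.add(item)
        # Test phase: all items are tested, known items score as correct.
        scores.append(100.0 * len(known) / len(items))
        # Flag only the still-missed items for the next study phase.
        to_study = {i for i in items if i not in known}
    return scores

scores = run_rounds([f"item{i}" for i in range(15)])
print(scores)  # round scores rise (or hold) as the known set only grows
```

Because the set of known items never shrinks, the simulated round scores are monotonically non-decreasing, mirroring the Round 1 to Round 3 improvement reported below.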

Results and Discussion

Retrieval Practice Experience and Long-Term Retention

Table 1 shows typical scores from implementing the retrieval practice experience in a first-semester General Chemistry section.

Table 1. Retrieval practice data from four General Chemistry I sections (average test score ± std. dev.)

Section     Test 1         Test 3         Final          % Retained(a)
1 (n=44)    38.5 ± 23.9    75.3 ± 21.3    52.4 ± 29.1    67.6 ± 34.2
2 (n=15)    36.9 ± 24.3    77.8 ± 20.3    38.2 ± 30.9    45.8 ± 28.7
3 (n=21)    49.5 ± 28.6    77.8 ± 25.9    47.0 ± 31.6    61.4 ± 32.1
4 (n=22)    54.6 ± 28.7    93.0 ± 11.0    50.5 ± 35.6    52.5 ± 35.3

(a) % retained is calculated as Final Test Score / Round 3 Test Score × 100.
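The footnote formula can be made concrete with a small helper. This is the editor's illustrative sketch, not the chapter's code; the scores passed in are hypothetical (and note that the tabulated values carry standard deviations, which indicates they are averaged over students rather than computed from section-average scores).

```python
def percent_retained(final_score: float, round3_score: float) -> float:
    """% retained = Final Test Score / Round 3 Test Score * 100."""
    if round3_score == 0:
        raise ValueError("Round 3 score of zero: % retained is undefined")
    return final_score / round3_score * 100.0

# Hypothetical single student: 80 on the Round 3 test, 60 on the delayed final.
print(percent_retained(60, 80))  # -> 75.0
```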

On average, student scores increased from the Round 1 Test to the Round 3 Test on the first day. Table 1 also shows scores for the Final Test that was given, unannounced, one week later. As an approximate measure of the amount of material retained after one week, the Final Test score was divided by the Round 3 Test score. The average % retention value ranged from 45.8% in section 2 (an 8 AM lab section) to 67.6% in section 1, even though the students had not studied or been tested on the material for one week. The % retention values reported here do not quite reach the 80% reported by Karpicke & Roediger (1), but this could be due to differences in experimental conditions imposed by class-time restrictions and/or differences in the learning material. Nevertheless, the % retention values reported here still suggest a significant amount of long-term retention caused by retrieval practice (also known as the testing effect). The experience in section 4 of Table 1 was administered by a different professor in an effort to get a preliminary measure of the generalizability of the approach, and the results for that section fall within the range of the other three sections. As a frame of reference, in the Karpicke & Roediger (1) study, the experimental condition that involved repeated studying of all items but not repeated testing of all items (i.e., the condition that did not lead to long-term retention) yielded an average % retention value of 36%. Therefore, the results presented here provide further confirmation of the benefits of retrieval practice over repeated studying for long-term retention.

Retrieval Practice Experience Metacognition Follow-Up

After the retrieval practice experience on the first day, the students were provided a document that discussed the things that were done for them that day that they would have to do for themselves if they wanted to adopt the learning strategy of retrieval practice (Figure 3). The document also mentioned the benefits of retrieval practice in an effort to promote buy-in to this learning strategy. To motivate the students to learn the benefits and tasks of retrieval practice, they were told that they would be quizzed on this material in the next class period. All of this was done to help the students understand how to put the strategy into practice on their own.

Figure 3. Retrieval practice metacognition follow-up document.

Limitations of the Study

While we are confident in the benefits of helping students experience this strong metacognitive learning strategy, there are some limitations of the study that need to be addressed. One limitation compared to the Karpicke & Roediger (1) study is that we were not able to achieve 100% correct answers in the initial rounds of testing. As a result, our “% Retained” values only approximately represent a percentage of retention (i.e., long-term learning) after one week. Another limitation is the large standard deviations seen across all test scores. Yet another limitation is the assumption that students did not review the material during the week between the retrieval practice experience and the final test. While it is unlikely that students reviewed material for a final test they were not expecting, this assumption may be incorrect for some students. Additionally, it is not known how many students in the Fall 2017 semester actually adopted the strategy. As for the concluding remark below suggesting that this intervention increased the student success rate in the course, it is important to mention that we do not have ACT/SAT scores to compare students in the intervention with students in previous years; therefore, the increased success rate observed could be due to better high school academic preparation. Finally, in the Karpicke & Roediger (1) study, the students studied a list of Swahili-English word pairs that they had never seen before, whereas in this study, prior knowledge of polyatomic ions and unit analysis tools may have had an impact on the results.

Future Work

Future work might involve surveying students’ general sleep habits (duration, bedtime, and wake time) and the amount of sleep obtained the night before the retrieval practice session, since sleep has been shown to play a critical role in promoting declarative memory (3). Future work also involves the use of an additional “Homework-as-Assessment” (4) in-class experience and an intervention that teaches students about Bloom’s Taxonomy (4) as discussed below.

Concluding Remarks
In this paper, we report the implementation of an in-class retrieval practice experience for first-semester General Chemistry students. The results suggest that a substantial amount of long-term retention occurred for the material in the experience. The intervention was designed to help students experience retrieval practice and to convince them of its benefits so that they might adopt it as a strategy during their time spent learning material outside of class. Furthermore, knowledge of polyatomic ions and unit analysis tools is very important to success in many General Chemistry topics, so this experience doubled as a learning experience that helped the students in topics they learned later in the semester. All of this was done in an effort to increase student success in General Chemistry I, and preliminary data suggest that it has. For example, about 75% of the students (35 of 47) who experienced this early-semester intervention passed a Fall 2017 General Chemistry I course that traditionally has a pass rate of 65%; very similar exams were given in Fall 2017 compared to previous semesters. As mentioned in the “Limitations” section above, the lack of ACT/SAT scores for the Fall 2017 sections compared to previous years is important to consider, but there were also several intangible benefits noticed by the professor. For example, student office visits appeared to be more productive, with concrete “game plans” for success being created. The professor was able to remind struggling students of “that second day of class where we did the retrieval practice experience,” and mentoring them through the process of adopting the strategy seemed easier because they had experienced it in class. Further development of this intervention, coupled with the very helpful advice contained in Saundra McGuire’s “Teach Students How to Learn” (4), has further increased the pass rate to about 87% (33 of 38) in a Fall 2018 General Chemistry I course (manuscript in preparation).
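One hedged way to gauge the pass-rate comparison above (35 of 47 passing versus a historical 65% rate) is an exact binomial tail probability. This calculation is the editor's, not the authors'; it treats the historical 65% as a fixed baseline, which ignores the sampling uncertainty and preparation differences the authors rightly flag in the Limitations section.

```python
from math import comb

def binom_tail(k: int, n: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p), computed exactly from
    the binomial probability mass function."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# 35 of 47 students passed; the course's traditional pass rate is ~65%.
p_value = binom_tail(35, 47, 0.65)
print(f"P(at least 35 of 47 pass | p = 0.65) = {p_value:.3f}")
```

A small tail probability would be consistent with a genuine improvement, but with n = 47 and the confounds noted above, the result is only suggestive, matching the chapter's "preliminary data" framing.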


References
1. Karpicke, J. D.; Roediger, H. L. The Critical Importance of Retrieval for Learning. Science 2008, 319, 966–968.
2. Roediger, H. L.; Putnam, A. L.; Smith, M. A. Ten Benefits of Testing and Their Applications to Educational Practice. In The Psychology of Learning and Motivation: Vol. 55. Cognition in Education; Elsevier Academic Press: San Diego, CA, 2011.
3. Lahl, O.; Wispel, C.; Willigens, B.; Pietrowsky, R. An Ultra Short Episode of Sleep Is Sufficient to Promote Declarative Memory Performance. J. Sleep Res. 2008, 17, 3–10.
4. McGuire, S. Y.; McGuire, S. Teach Students How to Learn; Stylus Publishing, LLC: Sterling, VA, 2015.


Chapter 5

Scaffolding Underprepared Students’ Learning in General Chemistry I: Approach and Assessment

Suely Meth Black*

Norfolk State University, 700 Park Avenue, Norfolk, Virginia 23504, United States
*E-mail: [email protected].

The CHM 221 General Chemistry I course at Norfolk State University serves Biology, Computer Science, Mathematics, Physics, Technology, and Kinesiology majors. Students’ preparation for and interest in the course vary broadly, and quite a few juggle school demands with work and family responsibilities. Over the past several years, the author has implemented various instructional approaches. Analysis of these approaches and of students’ performance in the course, informed by the literature, has led to the design of a combination of instructional and assessment activities that address students’ lack of preparation in both content and study habits. This chapter describes the course structure and how early assessment results correlate with course outcomes.

Introduction
This paper contains the results of instructional enhancement efforts in CHM 221, the General Chemistry I course for science majors at Norfolk State University (NSU). NSU, a Master’s Level 1 institution, is an urban Historically Black College and University with an enrollment of approximately 5,000 students, 84% of whom are Black. NSU serves a large number of first-generation college students, with the majority qualifying for federal education grants. Entering freshman GPA has averaged 2.95, while SAT scores have hovered just over 850 in the past five years. The College of Science, Engineering and Technology (CSET) has enrolled an average of 1,800 majors and has conferred around 300 degrees a year over the past several years. The Chemistry Department offers four different General Chemistry courses and sequences to fulfill the varying requirements of science, engineering and medical-field-related curricula. Students pursuing the following majors must successfully complete the first semester of the two-semester General Chemistry sequence (CHM 221): Computer Science, Electronics Technology, Kinesiotherapy, Mathematics, Medical Technology, and Physics. Biology majors and CSET honors students must also pass the second-semester course (CHM 222). Chemistry majors attend separate, 4-credit-hour courses, CHM 223A and CHM 224A. This article describes instructional approaches, assessment methods and result analysis for seven sections of General Chemistry I, CHM 221.

For three years, the author taught over 50% of the course enrollment in CHM 221 and intentionally infused the course with best teaching practices to improve outcomes. The effort has yielded mixed results depending on class composition, with pass rates ranging from 48% to 73%, similar to those attained by other course instructors. Teaching general chemistry for science majors at Norfolk State University presents a variety of challenges. Students come to the course with different levels of mathematical ability and confidence, and many have never taken a chemistry course before. The science reading and comprehension skills of some fall below the desired level, which is compounded by the need to learn the language of chemistry, leading to difficulty in interpreting and solving word problems. Most have never engaged with course content that requires mapping among mathematical, symbolic, macroscopic and microscopic representations, an essential skill for effective chemistry learning. The majority of students have poor study habits, reflected for instance in low homework completion rates. Many support themselves and/or family members. Instructors face the challenge of facilitating students’ learning while helping them overcome these various obstacles. Freshmen through seniors enroll in the course; some students are still adjusting to college, while others see the course only as one of the final obstacles before graduation. A large number of students approach the course expecting to succeed by memorizing facts and being trained on how to solve every kind of problem. Some presume that diligently attending class and following examples (or fellow students’ solutions) to complete homework will guarantee passing the course. Many also lack the skills to critically analyze their course performance and to adjust their learning approach to fit the course demands.
This scenario is not unique to NSU, and a large body of literature describes efforts for early detection and intervention to increase the likelihood of student success (1). A 2012 advisory group report to the U.S. President pointed out that comparatively larger numbers of freshman Hispanic and African American students do not complete their STEM degrees (2). This continues to be of concern, as approximately 30% of the country’s population consists of underrepresented racial and ethnic individuals, yet these groups make up only nine percent of the science and engineering workforce (3). Because HBCUs play a major role in producing African American STEM graduates (4), improvements in General Chemistry I course outcomes at NSU should lead to increased racial diversity in the science workforce. The Committee on the Status, Contributions, and Future Directions of Discipline-Based Education Research (3) identifies problem solving, quantitative reasoning and modeling skills as necessary for successful careers in science. Meanwhile, the Presidential report cited above (2) strikingly concludes that current science and engineering courses fall short of preparing the country’s technical workforce, and it calls for the widespread adoption of empirically validated teaching practices to address these inadequacies. This article seeks to shed light on the impact and limitations of teaching best practices in the General Chemistry I course at NSU, with the goal of informing interventions that positively impact a large number of underrepresented minority science students and better prepare them to succeed in science careers.

Course Approach
Through a design, implementation, assessment, reflection and re-design cycle, the author improved CHM 221 course delivery over a three-year period. As recommended in “Visible Learning and the Science of How We Learn” (VLSL) (5), the impact of instructional elements and their integration on students’ learning was analyzed, with the instructor playing the role of a change agent purposefully seeking to impact learning by adjusting her instructional approach. Drawing on a large body of research results, VLSL provides practical advice on creating experiences that enable learning by all learners. Intentionality in planning, with clearly established learning outcomes and visible criteria for success, should guide the course design and all activities. The goal of the instructor is to create trusting environments and to facilitate students’ development of habits and practices that lead to mastery. Effective teaching entails providing learners ways to move from a starting position to the desired learning outcome (6). Therefore, the CHM 221 course planning included layers of formative assessment to identify learners’ knowledge starting points and layers of summative assessment to establish whether desired outcomes had been achieved. Assessment results guided the pace and content of course delivery, and the deployment of activities that allowed for students’ deliberate practice and occasions for trial and error where immediate feedback and re-teaching occurred. Over the years, the author tried and tested various instructional and assessment components to scaffold students’ learning based on research-vetted instructional best practices. Below are details about the course components and how they contribute to course delivery and to students’ learning. How each component contributes to the grade is described at the end of this section.

Pre-Course Formative Assessment

Pre-Course Quiz

Before the first day of classes, students enrolled in CHM 221 receive an email welcoming them to the course, with an attached pre-course packet. The packet contains a document they must sign to certify that they have completed the pre-requisites, a two-page pre-course high-school chemistry review worksheet, and a one-page metacognition reflection. Examples of worksheet questions are “What makes up an atom? Draw an atom of nitrogen,” “What is the difference between an oxygen atom and an oxygen molecule?,” “What is more, 10 kg or 10 g?,” and “How many milliliters of water are there in a water-filled glass?” Metacognition questions include “How long did it take for you to complete the worksheet?,” “Was completing it super easy, easy, hard or extra hard?,” “Were the problems involving math more or less challenging than the conceptual ones?,” “Did you have to look up a lot of the answers?,” “Was it easy to find the answers when you had to look them up?,” “Based on your work on the Pre-Course worksheet, and your answers above, how well do you think you are prepared for the course?,” and “So now, what will you do to ensure you are successful in the course?” Teaching students to reflect on their learning (metacognition) and providing them practical strategies on how to study has been shown to lead to large improvements in students’ performance (7). The pre-course quiz aims primarily to identify students’ prior knowledge, ability to seek help, and commitment to preparing for an announced assessment.

In-Class Group Solution of Fundamental Chemistry Multiple-Choice Questions

During the first one or two days of the semester, the class solves a small set of simple multiple-choice questions. In CHM 221, questions include basic ideas about matter, such as classification of matter, composition of atoms and ions, reading the Periodic Table, etc. The instructor allows a short time for students to attempt to answer a question, then asks the whole class or a specific student to provide the answer. Starting the course with a group assessment has many advantages. First, it announces to the students that there is no “easing” into the semester. Second, students immediately gauge their level of preparation for the course and how they stand compared to their classmates. Finally, it provides the instructor a first pulse on the class: the engagement and level of knowledge of the group and of individual students.


In-Class Learning Instructor-Created Class Notes The instructor provides daily handouts called “Class Notes” for each class period. The Class Notes serve as the lesson plan for the day, and contain both information and guided activities that keep students engaged and reduce “copying” time. Class Notes are instructor-created handouts with summary of the most important facts for the day’s lesson, with strategic blanks to be filled out by students during discussions. Figure 1 shows a page of a Class Notes handout, with expected answers in italic. Class Notes use corresponds to a time-saving, effective strategy described in “Teaching and Learning STEM: A Practical Guide” (8). Class Notes facilitate in-class active learning, which has been shown to lead to a 55% increased student performance in science, engineering and mathematics courses when compared to lecturing (and slides) (9). In-class, guided active learning promotes deliberate practice with in-time feedback, facilitating students’ reflection on their level of understanding of the new topic before they leave the classroom. While students engage in Class Notes activities individually or as a group, the instructor walks around the classroom, encouraging, guiding, correcting, probing, returning graded assignments or distributing new ones. After each class, the instructor reflects on the classroom experience, and takes notes on how to enhance the lesson mini activities and content. The Class Notes have been revised over many semesters, and have proven to greatly facilitate learning based on students’ feedback and faster course pace, which allowed for greater content delivery without impact on students’ performance. Slides, Graphics, PhET Simulations, YouTube Videos For a few years, the author used PowerPoint slides with graphics and resources provided by the textbook publisher to complement the information in the Class Notes. Slides were phased out based on students’ feedback. 
Currently, the author projects the Class Notes on the screen while students follow along. When appropriate, the instructor shows pictures from the e-textbook, runs University of Colorado PhET simulations (10), and plays selected YouTube videos to complement the Class Notes. The PhET Interactive Simulations project offers free online math and science simulations that allow students to observe and virtually perform experiments, learning through exploration. Having the Class Notes on the screen helps students follow along, while in-class references to the textbook, simulations, and videos familiarize students with material they are expected to access outside of class.


Figure 1. Page 1 of Class Notes 4 on Chapter 4: Chemical Quantities and Aqueous Reactions. Answers are in italics. Students draw the particulate view of the dissolved species.

Formative Assessment

Extra-Point “Pop” Quizzes

Announced and unannounced “pop” quizzes take place in the first 5 minutes of class. They have had a clear impact on students’ timely arrival and class attendance. Because they are worth extra points, there are no make-ups. While students take the short assessment, the instructor can get set up. After the pop quiz, students are alert and ready to engage with the course material.

Clicker Questions

Paper pop quizzes are sometimes replaced by radio-frequency response system multiple-choice “clicker” questions (8) at the beginning of class. Clicker questions are also interspersed in the Class Notes. Like pop quizzes, they earn extra points and get students attuned to the class. Because students’ anonymous responses are projected when the poll closes, clicker questions give both the instructor and the students themselves an immediate assessment of the class’s grasp of a concept.

Online Homework (Sapling Learning (11), Mastering Chemistry (12))

Students must complete two or three sets of online homework assignments per chapter, each requiring one to two hours. The assignments address concepts covered in the previous one or two class meetings and require students to consult references as needed to solve problems ranging from simple to complex, including tutorials, simulations, and multi-part problems. Late submissions receive a 10% penalty per day.

Paper Homework

Short paper homework, distributed in class and due the next meeting, reinforces and complements material discussed in class. It contains simple problems intended to encourage students to engage with the course material and prepare for the next class period, which will generally start with a pop quiz, clicker questions, or a 30-minute chapter quiz.
Out-of-Class Learning Scaffolding

Voluntary Study Sessions with the Instructor or a Student Tutor for Extra Points

For two semesters, the author offered weekly study sessions to help students learn the course material. During these sessions, the instructor or a student tutor led teams of students through a set of problems. In this model, the instructor or tutor still “controls” the activities, and students generally work on the same problem at the same time, with the facilitator solving the problem on the board as needed.

Voluntary Learning Team Meetings Facilitated by the Instructor for Extra Points

For two semesters, the instructor-led study sessions became instructor-facilitated Learning Team meetings, an adaptation of the Peer-Led Team Learning (PLTL) model (13–15). In traditional PLTL, students work in formalized groups outside regular classroom time to solve a set of problems. The group activity is facilitated by a student who performed well in the course in a previous semester (the Peer Leader), who works closely with the instructor. Peer Leaders receive formal training on how

to facilitate the sessions to promote group work and students’ self-efficacy. PLTL meetings seek to improve undergraduates’ problem-solving skills through active learning while teaching them how to study in groups (16). At NSU, the instructor, sometimes assisted by peer tutors, facilitated the activity instead of a peer leader. During Learning Team meetings, students worked in small groups to tackle an assignment designed to promote engagement with the course material while developing critical thinking. The first assignments include directions on how team members should work together; this approach stimulates interactions and encourages the participation of all team members. At the beginning of the semester, the instructor serves as a facilitator of the group interactions, encouraging students to speak up and work cooperatively toward correct answers. Students are encouraged to go to the board and explain problems to one another. The instructor’s role is to guide individual students and groups in their learning, keeping direct instruction to a minimum. Later assignments carry fewer directions on team members’ participation, as students become more self-driven and comfortable working as a group. The instructor recorded attendance at Learning Team meetings and verified that students completed the assignments accurately. Students who attended Learning Team meetings earned 10 bonus points on the assignment; students who did not attend completed the assignments independently. LT assignments counted toward the paper homework grade average.

Instructor-Created YouTube Video Playlists

Because in-class time for repeated practice, and even for introducing some of the course material, is limited, YouTube playlists for each chapter were created and made available to students. Students appreciate having a list of videos that have been vetted by their instructor.
Practice Tests

Practice tests are posted on the course online management system so that students can familiarize themselves with the kind and level of assessment they will encounter on in-class tests. This practice reduces students’ stress and guides their studying, as most students in General Chemistry have difficulty distinguishing what is of fundamental importance from what is not.

Course Portfolio

Students prepare and submit a folder containing completed Class Notes, graded quizzes, and homework assignments to the instructor on the day of a quiz or test. Quizzes and assignments must be corrected by the student in red pen after they have been graded and returned. At the beginning of the semester, students submit the portfolio on quiz and test days and receive feedback from the instructor on the quality of their work. Once students understand what is expected, they turn in their portfolios only on test days. The instructor grades the portfolios for completeness while students take the in-class assessment, and returns them the same day.

Summative Assessment

Chapter Quizzes

At the end of each chapter, a thirty-minute, free-answer quiz assesses students’ learning outcomes. Quizzes include problems requiring students to show their understanding of, and ability to apply, chapter information. Results inform the improvement of instruction for the following semester and may lead to re-teaching of unclear concepts and skills. If over 50% of the students do not

perform well on the quiz, all students may choose to retake it at a scheduled time in the evening. Before retaking the quiz, they are required to correct the original quiz and complete a worksheet of additional practice problems. The higher quiz grade is kept.

Multi-Chapter Tests

Three one-hour exams each cover the content of three or four chapters. They include a multiple-choice component (assessing understanding and applying objectives) and a free-answer component (assessing applying, analyzing, and evaluating objectives). Practice tests are available to help students prepare for each test.

5th Week, 10th Week, and Pre-Final Advisory Grades

Average grades (homework, quizzes, tests) calculated at the end of the 5th and 10th weeks of class, named ADV1 and ADV2, provide students and instructor with a snapshot of performance in the class. Although many instructors complain about the burden of calculating an extra advisory grade, the author has seen the positive impact of giving some students an early alert that their approach to the course needs serious rethinking, with enough time for changes to affect the course outcome. Students also receive their average in the course prior to the final exam (ADV3). The ADV1 grade includes all grades through week 5, weighted as shown in Table 1. ADV2 combines the ADV1 average weighted at 40% with grades between week 5 and week 10 weighted at 60%. ADV3 combines ADV2 weighted at 62.5% with grades between week 10 and the last day of classes weighted at 37.5%.

Chemistry Concept Inventory

At the end of the course, students take the Chemistry Concept Inventory test (17). The results help the instructor identify gaps in students’ conceptual understanding of fundamental chemistry.

Cumulative Final Exam

The final exam assesses remembering, understanding, applying, and analyzing objectives through multiple-choice and free-answer questions.
Students whose average grade is an “A” prior to the final exam are exempt from taking it. The computation of grades, shown in Table 1, was refined over the years to promote students’ engagement with course material in and out of class and to give students a chance to pass the course if their performance improves as the semester progresses. The prospect of exemption from the final exam motivates students to complete all assigned and extra work; only honors students achieve this very difficult feat. For all other students, the final exam grade, weighted at 20%, is added to ADV3 weighted at 80%.
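The advisory-grade weighting can be sketched in a few lines of code. This is an illustrative reconstruction based on the percentages described in the text; the function name and the 90-point cutoff used for the “A” exemption are assumptions, not the author’s stated policy.

```python
def course_grade(wk1_5, wk6_10, wk11_end, final_exam=None):
    """Illustrative reconstruction of the advisory-grade weighting.

    ADV1 is the average through week 5; ADV2 = 40% ADV1 + 60% of the
    weeks 6-10 average; ADV3 = 62.5% ADV2 + 37.5% of the weeks 11-end
    average. The course grade is 80% ADV3 + 20% final exam, with the
    exam waived for an "A" average (assumed here to mean >= 90).
    """
    adv1 = wk1_5
    adv2 = 0.40 * adv1 + 0.60 * wk6_10
    adv3 = 0.625 * adv2 + 0.375 * wk11_end
    if final_exam is None or adv3 >= 90:  # "A" exemption (assumed cutoff)
        return adv1, adv2, adv3, adv3
    return adv1, adv2, adv3, 0.80 * adv3 + 0.20 * final_exam

# A student averaging 85, 78, and 80 across the three periods who then
# scores 75 on the final exam:
adv1, adv2, adv3, grade = course_grade(85, 78, 80, final_exam=75)
# adv2 ≈ 80.8, adv3 ≈ 80.5, course grade ≈ 79.4
```

Note how the nested weighting means early grades never fully disappear: ADV1 still contributes 0.40 × 0.625 × 0.80 = 20% of the final course grade, so a weak start can be substantially, but not completely, overcome.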


Table 1. Grade Computation - Student performance assessment. Assessments prior to the final exam add up to 100% to facilitate computation of the pre-final grade. The pre-final grade is weighted at 80% and added to the final exam grade weighted at 20% for those students who are not exempt.

| Advisory Period Assessments | Weight | Potential extra points |
|---|---|---|
| Paper homework | 10% | Opportunities for make-up offered; Learning Team work earns bonus points |
| Online homework | 10% | -10% per day late penalty; +10% bonus applied to online homework grade |
| Quizzes | 20% | Up to 10 extra points from pop quizzes; up to 3 extra points from portfolio |
| Multi-chapter tests | 60% | Up to 5 extra points from practice tests and worksheets; up to 3 extra points from portfolio; up to 3 extra points from clicker questions |
| Final exam (added after all other assessments are weighted to 80%) | 20% | Up to 5 points from Chemistry Concept Inventory test; up to 8 points from SALG survey (group effort) |

Closing the Loop

The instructor asked for students’ opinions about the usefulness of various class components. Based on students’ feedback, the Class Notes were improved and the use of slides was discontinued entirely; instead, the instructor projects the Class Notes while writing complementary information on the board. Students appreciate the multiple opportunities for assessment and practice. The opportunity to prepare for tests by solving problems similar to those they will encounter has allowed for greater depth and breadth of test questions and improved students’ self-efficacy. Likewise, allowing students to prepare for and retake assessments they failed outside class meeting times is very popular, with 50% or more of students improving their performance on the retake. Retake exams cover material similar to the original exam (but with different questions) and give students an opportunity to identify, review, and learn the concepts they did not master the first time. In response to low online homework completion rates, for one semester the instructor required only paper homework. But that practice, limited by the fact that the instructor grades the papers herself, reduced the variety of problems available to students who actually complete homework, so online homework was required again in following semesters. Promoting students’ engagement with the textbook has been the greatest challenge. Improvements to textbook-integrated online homework systems, with links to the text from within the questions, seem to have changed students’ behavior somewhat for the better. Nevertheless, online homework completion rates remain low.

How Much Has All This Helped?

Even though the instructor made improvements to the course based on students’ feedback, students’ rate of success has not consistently improved, remaining around 60% over the years. A comparative analysis of 1st Advisory Grades (after five weeks of classes, ADV1), passing rates, and results on the pre-course test (P-Test) may shed light on this lack of correspondence between course design and students’ outcomes. P-Test grades reflect students’ existing basic knowledge and/or good study habits. This seems a fair assumption, as prior to the first day of

class, students receive a worksheet and are directed to complete it for submission and to use it to prepare for the test. At the 5th week of classes, ADV1 reflects students’ ability to adjust to the fast pace of the course and to the need to consistently function beyond the first two levels of Bloom’s Taxonomy. They must regularly complete homework and prepare for in-class assessment. Students lacking the prerequisite knowledge and skills must use outside support (tutors, learning team) to overcome this handicap early on. Figure 2 shows the correlation between students’ 1st Advisory Grades (ADV1) and students’ course outcomes.

Figure 2. Correlation, as percentages, between 140 students’ course outcomes (Passed, Failed/Dropped), Pre-Test grades (A, B, C P-Test), and 1st Advisory Grades (A, B, C, or D-F, on bars) for six CHM 221 sections.

Interpretation of these data leads to the following conclusions:

1) Students who earn an A on ADV1 (left and lightest bar area) are very likely to pass the course; they start with a good or very good chemistry and mathematics background (most earning an A or B on the P-Test), complete assignments regularly, and engage with the course material from the beginning of the semester.

2) Students who earn a B on ADV1 are also very likely to pass the course, even though quite a few may not start with the requisite chemistry and mathematics background (a wide distribution of P-Test grades), which suggests that they are able to overcome their lack of preparation by taking advantage of the course’s supportive structure.

3) Students earning a C or D on ADV1 are less likely to pass the course, even though many seem to have the prerequisite knowledge to succeed (A or B P-Test grades). Most likely this reflects a failure to develop the study habits needed to learn chemistry concepts more complex than those in high school. Absence during the first week of classes (no P-Test) may indicate that students do not appreciate the need for full engagement, or that external factors are adversely affecting their commitment to the course.

4) Finally, an F grade on ADV1 predicted course failure with 100% certainty, regardless of students’ incoming preparation level. These students generally do not submit homework, fail to attend classes regularly, and miss one or more in-class assessments. Many may have the prerequisite knowledge to succeed, but personal challenges, lack of motivation, or lack of needed study skills inhibit their course engagement.

Table 2 summarizes the effect of student attendance at Learning Team (LT) meetings on ADV1 and course pass rates since the inception of the practice. Four two-hour Learning Team sessions a week were offered for eleven weeks each semester. Students chose their meeting team and voluntarily attended (for extra points) one or more meetings a week. Learning Team meetings start in the third week of classes, so students would have had a chance to attend three meetings prior to ADV1. The results in Table 2 show that almost 100% of students earning an A or B on ADV1 regularly attended LT meetings throughout the semester, missing at most two. Students with a C or D on ADV1 attended meetings to varying degrees. No student who earned an F on ADV1 regularly attended LT meetings.

Table 2. 5th Week Advisory Grade (ADV1) and course outcomes for students who regularly attended Learning Team meetings during two semesters.

| Grade of students who regularly attended LT meetings | Semester 1 | Semester 1 | Semester 2 |
|---|---|---|---|
| A-B ADV1 Grade | 100% | 100% | 82% |
| C-D ADV1 Grade | 44% | 50% | 94% |
| F ADV1 Grade | 0% | 0% | 0% |
| Pass Rate | 92% | 100% | 89% |

Even more telling are the results presented in Table 3, which show that the overall pass rate of the course corresponds exactly to the rate of regular LT participation. Detailed analysis shows that these are not exactly the same students, but attendance at LT meetings apparently provides a very good prediction of the course pass rate. Learning Team participation also seems to have a positive effect on the persistence rate; the sense of belonging engendered by participation in LT meetings may explain students’ continued participation even when they have little hope of passing the course.

Table 3. Effect of regular Learning Team meeting attendance on course pass rates and course persistence rates in 2018, when Learning Teams were offered.

| | Semester 1 | Semester 1 | Semester 2 |
|---|---|---|---|
| Class Size | 21 | 21 | 18 |
| Learning Team Attendance rate | 62% | 48% | 72% |
| Overall Pass rate | 62% | 48% | 72% |
| Overall Persistence rate | 95% | 71% | 94% |


Further Thoughts and Recommendations

Although the author has tried to create an environment that supports students’ learning in and out of the classroom, the General Chemistry I course pass rates have not improved much over the past years, averaging 60%. Students’ poor mathematics skills and deficient science literacy, combined with low expectations for the effort required to succeed, have remained the greatest challenges. Nevertheless, the analysis of the impact of various class components showed that interactions with the instructor in and out of class, structured by the use of Class Notes and Learning Team meetings, are highly beneficial. Analysis of the effect of Learning Team participation shows a large impact on persistence and a strong correlation between LT participation and course pass rates. The course has led some excellent students to change their majors to chemistry or to add chemistry as a minor. Students’ outcomes during the first five weeks of class serve as an excellent predictor of course success and should inform the design of a different approach: identifying students with a high propensity for failure prior to their enrollment in the course. For a large number of students, the opportunity to develop the necessary knowledge, skills, and habits before enrolling in General Chemistry I, together with support to address personal challenges, seems critical to improving course outcomes.

References

1. Shields, S. P.; Hogrebe, M. C.; Spees, W. M.; Handlin, L. B.; Noelken, G. P.; Riley, J. M.; Frey, R. F. A Transition Program for Underprepared Students in General Chemistry: Diagnosis, Implementation, and Evaluation. J. Chem. Educ. 2012, 89 (8), 995–1001.
2. Olson, S.; Riordan, D. G. Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics. Report to the President; Executive Office of the President, 2012.
3. Singer, S. R.; Nielsen, N. R.; Schweingruber, H. A. Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering; The National Academies Press: Washington, DC, 2012.
4. Espinosa, L. L.; McGuire, K.; Jackson, L. M. Minority Serving Institutions: America's Underutilized Resource for Strengthening the STEM Workforce; The National Academies Press: Washington, DC, 2019.
5. Yates, G. C.; Hattie, J. Visible Learning and the Science of How We Learn; Routledge: New York, 2014.
6. Yates, G. C.; Hattie, J. Visible Learning and the Science of How We Learn; Routledge: New York, 2014; p xii.
7. McGuire, S. Y. Teach Students How to Learn: Strategies You Can Incorporate into Any Course To Improve Student Metacognition, Study Skills, and Motivation; Stylus Publishing, LLC: Sterling, VA, 2015.
8. Felder, R. M.; Brent, R. Teaching and Learning STEM: A Practical Guide; Jossey-Bass: San Francisco, CA, 2016.
9. Freeman, S.; Eddy, S. L.; McDonough, M.; Smith, M. K.; Okoroafor, N.; Jordt, H.; Wenderoth, M. P. Active Learning Increases Student Performance in Science, Engineering, and Mathematics. Proc. Natl. Acad. Sci. U.S.A. 2014, 111 (23), 8410–8415.
10. PhET Interactive Simulations. https://phet.colorado.edu/ (accessed March 3, 2019).
11. Macmillan Learning Sapling Learning. https://www.macmillanlearning.com/Catalog/elearningbrowsebymediatype/SaplingLearning/ (accessed April 21, 2019).
12. Pearson Mastering Chemistry. https://www.pearsonmylabandmastering.com/northamerica/masteringchemistry/ (accessed April 21, 2019).
13. Hockings, S. C.; DeAngelis, K. J.; Frey, R. F. Peer-Led Team Learning in General Chemistry: Implementation and Evaluation. J. Chem. Educ. 2008, 85 (7), 990.
14. Gosser, D. K.; Roth, V. The Workshop Chemistry Project: Peer-Led Team Learning. J. Chem. Educ. 1998, 75, 185.
15. Gosser, D. K.; Cracolice, M.; Kampmeier, J. A.; Roth, V.; Strozak, V. S.; Varma-Nelson, P. Peer-Led Team Learning: A Guidebook; Prentice Hall: Upper Saddle River, NJ, 2001.
16. Quitadamo, I. J.; Brahler, C. J.; Crouch, G. J. Peer-Led Team Learning: A Prospective Method for Increasing Critical Thinking in Undergraduate Science Courses. Science Educator 2009, 18 (1), 29–30.
17. Krause, S.; Birk, J.; Bauer, R.; Jenkins, B.; Pavelich, M. J. Development, Testing, and Application of a Chemistry Concept Inventory. In 34th Annual Frontiers in Education (FIE 2004); IEEE, 2004; p T1G-1.


Chapter 6

Adaptive Learning Technology in General Chemistry: Does It Support Student Success? Jessica M. Fautch* Department of Physical Sciences, York College of Pennsylvania, 441 Country Club Road, York, Pennsylvania 17403, United States *E-mail: [email protected].

At the national level, undergraduate programs in STEM are facing low enrollment and a high rate of attrition. In response to an institutional need for increased student support in introductory, or gateway, science courses, we aimed to close the gap for underprepared students in General Chemistry I. This course, populated mostly with freshman biology, chemistry, and engineering majors, was targeted as part of a larger institutional goal to enhance the persistence of students in STEM. Although the possibilities are abundant when considering improvements to a course or program, this initial effort targeted homework assignments in the course. In fall 2017 the adaptive homework system ALEKS was implemented in one section of general chemistry, while remaining sections utilized the historical department standard for online homework. Adaptive products are tailored to each student’s ability and provide a unique experience through an individualized learning path. In this pilot study, we investigated the impact of ALEKS on student learning through common exam questions, final course grades, and student perceptions. At the completion of this preliminary study, the department implemented ALEKS for all General Chemistry I sections to help close the achievement gap in the course. Ultimately, the support of students in this gateway course will contribute to the college-wide initiative to support all students in STEM.

Introduction

Institutions of higher education across the nation face significant challenges in student retention, particularly in Science, Technology, Engineering, and Math (STEM) disciplines. Students in these majors have a much higher attrition rate than their non-STEM peers (1). On average, less than 50% of students who enter college declaring a STEM major will eventually earn a degree in a STEM major (1). A number of studies have attempted to determine why students change majors out of STEM fields (2–6). In the foundational study by Seymour and Hewitt (6),

students consistently cite poor teaching in STEM as one of the factors contributing to a change of major out of a STEM program. Research has shown that the most critical time for a student to stay on track toward STEM degree completion is early: within the first two years of post-secondary education (3). During this time, coursework typically involves significant math and foundational science courses, such as calculus and chemistry. These freshman-level courses have in many cases been labeled “weed-out” courses for students who are not well prepared to pursue a degree in science. However, given the challenges facing the STEM workforce (1, 7), many colleges and universities are maintaining a student-centered approach and investing resources in helping students achieve success. When addressing the high attrition and low success rates in 100-level STEM courses, instructors and administrators may wish to take a comprehensive approach and consider changing multiple segments of STEM education. For example, active learning (8–10), peer-led team learning (11, 12), and the flipped class (13–15) have been hailed in recent years as prime examples of practices that improve student success in the science classroom. In addition to pedagogical approaches, a variety of programs and interventions boast improved retention in STEM. Specifically, cohort building (16), residential learning communities (17–19), advising and mentoring (20, 21), and early exposure to research (22–24) are among the practices known to impact student success. Despite the challenges facing STEM majors in their first two years, institutions of higher education are now poised to embrace change (25) in order to drive student success.
York College of Pennsylvania (YCP) is a private 4-year comprehensive college serving primarily undergraduate students in southeastern Pennsylvania with a typical undergraduate enrollment around 4,000 and an average class size of about 20 students. At YCP there is a strong emphasis on teaching excellence. The most populated majors are nursing, business, engineering, and biology. Science and engineering majors are required to complete General Chemistry I (CHM 134), which serves approximately 300 students each fall. The distribution of majors included in the current study is depicted in Figure 1. Nearly half of the course is populated by engineers, while a good portion of biology and undeclared majors also enroll in CHM 134. At YCP there is significant programming and support for undeclared majors, so many students begin college as undeclared with the intent to change majors into STEM, usually engineering or biology.

Figure 1. Fall 2017 enrollment in General Chemistry I (CHM 134) by major. A total of 279 students enrolled at the start of the semester, with engineering students comprising nearly half of the population.

The chemistry program is currently housed within the Physical Sciences Department, which includes the chemistry, forensic chemistry, math, and physics programs. Of particular interest to this department is the rate of student success, measured internally by DFW rates. The DFW rates capture

students who have earned a D or F or have withdrawn from the course after the initial add/drop period in week 1. As shown in Table 1, CHM 134 is clearly ready for improvement: the DFW rate in fall 2015 for CHM 134 was almost twice that of the entire department, and more than three times that of the entire campus. Students’ inability to complete this course successfully is both a department and a college-wide concern, especially given the relatively large number of students the course serves each fall.

Table 1. DFW rates for CHM 134 compared to the department and college

| | CHM 134 | Phys. Sci. Department | YCP campus |
|---|---|---|---|
| Fall 2015 | 23% | 13% | 7% |
| Fall 2016 | 18% | 12% | 7% |
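As a minimal illustration of the metric itself (using a hypothetical roster, not YCP data), the DFW rate is simply the share of the post-add/drop roster finishing with a D, F, or W:

```python
def dfw_rate(final_grades):
    """Fraction of students earning a D, F, or W on the final roster."""
    dfw = sum(1 for g in final_grades if g in {"D", "F", "W"})
    return dfw / len(final_grades)

# Hypothetical 10-student section: 3 of 10 earn a D, F, or W.
section = ["A", "B", "B", "C", "C", "D", "F", "W", "A", "C"]
print(f"{dfw_rate(section):.0%}")  # prints "30%"
```

Because the denominator counts only students still enrolled after week 1, early drops do not inflate the rate; a C counts as success under this metric even where a C may not satisfy a major's prerequisite chain.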

The high rate of failure and withdrawal in our general chemistry course is not out of the ordinary. Many colleges and universities throughout the nation face similar (or higher) rates of attrition in general chemistry, attributed in part to poor teaching in STEM (6). Even armed with this knowledge, it can be difficult to enact pedagogical change at the course (or instructor) level for the purpose of student success. Additionally, the availability of electronic teaching tools, new technology, peer-to-peer initiatives, active learning activities, and other interventions can become overwhelming. In this study, one relatively simple change to the online homework was made in one section of the course, and the impact on student learning was investigated. Although this was a pilot study, the underlying goal was to improve student performance and success in the gateway course: General Chemistry I.

Pilot Study

Adaptive Learning

As an initial attempt to address our institutional goal of improving student success in 100-level science courses, we implemented an adaptive homework system for one section of General Chemistry I (CHM 134) in the fall 2017 semester. Adaptive learning provides a tailored experience for each student and allows students to learn at their own pace. This autonomy provides a platform for motivated and talented students to skip ahead and avoid unnecessary or repetitive work, while also allowing less prepared students to learn basic and foundational knowledge (e.g., algebra) before attempting more complex applications (e.g., stoichiometry). Adaptive learning has been gaining traction in higher education as data on student learning are acquired and members of upper administration learn more about its benefits (26, 27). Providing a personalized learning path for students, especially in courses that act as potential bottlenecks to graduation, appeals to both administrators and the faculty teaching these courses. In particular, introductory math and science courses with low rates of success can benefit from adaptive learning methods. Although a number of homework platforms exist for general chemistry, few provide a truly adaptive experience. Some software may provide learning modules for students to review after completing a problem set and a brief formative assessment; this does not prevent the student from forgetting the entire unit after moving on to the next. Additionally, homework that merely blocks students from moving on to new questions before answering the current ones does not qualify as adaptive. These products are not learning from the student and adjusting; they simply block students from moving

on with minimal student remediation. Despite the well-meaning intent of providing follow-up assignments, very few products offer a fully adaptive experience. After investigating a variety of adaptive homework options for general chemistry, we selected ALEKS: Assessment and LEarning in Knowledge Spaces (28).

Student Population

The pilot study in the fall 2017 semester involved one section of CHM 134 using ALEKS, while the remaining nine sections of the course used the pre-existing online homework system. The non-ALEKS group had comparable assignment due dates using digital homework that was non-adaptive. Although data were collected from the entire department, not all faculty members adhered to the procedure for data collection and/or collected all the data described herein. Therefore, each analysis and comparison discussed in this report includes the ALEKS section (n=28) and a variety of non-ALEKS groups; in each case, the non-ALEKS group is described briefly. Table 2 gives the distribution of majors enrolled in the course for the ALEKS group and the entire remaining non-ALEKS group. Within the non-ALEKS group, four individual sections with similar populations (i.e., greater than 50% engineering) were identified. These students (n=113) did not differ notably from the entire non-ALEKS group, so the full non-ALEKS population was used as the comparison group for the ALEKS students. At our institution, allied health majors taking this course include medical laboratory science, nuclear medicine, respiratory care, and radiography; nursing students enroll in a different chemistry course. The “other” majors typically include math, psychology, and occasionally business and fine arts.

Table 2. Percentage of majors in the ALEKS and non-ALEKS groups enrolled in CHM 134

| | Allied Health | Biology | Chemistry | Engineering | Secondary Ed-Science | Undeclared | Other |
|---|---|---|---|---|---|---|---|
| ALEKS | 7.1 | 14.3 | 0 | 53.6 | 3.6 | 10.7 | 10.7 |
| Non-ALEKS | 7.2 | 23.7 | 5.4 | 43.7 | 2.5 | 13.3 | 4.3 |

Data Collection

During the pilot study a variety of information was collected using an IRB-approved process. First, a six-question content quiz administered through the learning management system (Moodle) was deployed at the start of the semester as a pre-quiz and at the end of the semester as a post-quiz. The areas of content included (1) mole conversion, (2) stoichiometry, (3) aqueous reactions, (4) frequency and energy of light, (5) quantum numbers, and (6) polarity of bonds. Students were awarded extra credit if they completed the quiz at either or both time points. Students were instructed to take the quiz each time with minimal help or outside involvement, and the extra credit was awarded upon completion. Students were not informed that the post-quiz contained exactly the same questions as the pre-quiz, but it is possible that some students could recall the questions from three months prior and thus be at an advantage on the post-quiz. Students were not informed of the correct answers in either case.

The second set of data collected was the success of students on six common exam questions. These questions were in multiple choice format and were delivered across all sections on the four exams given throughout the semester. These questions were distinctly different from those provided in the pre- and post-quiz. In addition to these content-related measures, students were also asked their opinion about their learning in pre- and post-surveys. Finally, as an overall comparison, the final grade distribution was compared between ALEKS and non-ALEKS students.

Because the pilot study involved one section of ALEKS students, most students (~250) in fall 2017 did not use ALEKS and were working in the standard non-ALEKS program. Students enrolled in General Chemistry II (CHM 136) the following spring 2018 semester (n = 75) continued homework in the non-ALEKS format, with one exception: one faculty member continued with ALEKS in spring 2018 in CHM 136 and thus offered an ALEKS experience to 34 students in General Chemistry II. Given their unique homework experience (i.e., working with both ALEKS and non-ALEKS), an opinion survey was administered to these 34 students at the end of the spring 2018 semester. The results of this survey are also presented in this study.

ALEKS: Adaptive Experience

After selecting an adaptive digital homework system (ALEKS), the framework for the homework within the online platform was developed. More information regarding case studies, as well as the theory behind how the adaptive nature of ALEKS works, can be found at www.aleks.com. The course within ALEKS was aligned to the existing textbook all students were required to purchase, and the ALEKS course outcomes were arranged according to the chapter order of the book. Instead of specific end-of-chapter problems, "topics" were selected and assigned. Multiple topics were placed within an "objective", and objectives were assigned approximately once per week. The students were told that the objectives were their homework assignments and that they had due dates. The instructor could see sample questions that might be asked on a particular topic, but each student had an individualized experience and may or may not have seen a particular question.
At the beginning of their learning path each student completed a comprehensive knowledge check, a quiz that gauges the student's prior knowledge in math and chemistry. This quiz set the baseline for each student, such that a student with extensive knowledge arrived at the start of the ALEKS course with a number of topics (and possibly objectives) already complete. This tailoring allowed the more advanced student to skip problems that were already known. Arguably, the more important outcome of the knowledge check is that a baseline was also set for the less prepared student, allowing that student to work at an individualized pace adapted to their level of knowledge. For example, a student with foundationally weak algebra skills would not be given a unit conversion problem until they had mastered problems involving the multiplication or division of fractions.

As a student worked problems within ALEKS, the system determined that the student had learned a topic after he or she correctly answered three questions in a row about that topic. At any point in the learning process the student could request an explanation of the current problem, without penalty. If a student answered a question incorrectly, there was no penalty other than that the streak of "three in a row correct" was reset. All students worked toward learning the topics within the objective by a given deadline. In addition to these weekly goals, students also worked to fill their "pie" with topics as they mastered them. The grade for the ALEKS homework was weighted 50% weekly objective grades (completed topics) and 50% "filling the pie". In order to earn 100% on "filling the pie", students needed to fill at least 85% of the pie by the end of the semester. It should be noted that if a student missed a topic early on, or did not complete an objective by the due date, not all topics would "fill" the pie.
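The 50/50 weighting described above can be summarized with a short sketch. The function name is hypothetical, and the linear scaling of pie credit below the 85% threshold is an assumption; the text specifies only that 85% or more of the pie earns full credit on that component.

```python
def homework_grade(objective_scores, pie_fraction):
    """Sketch of the 50/50 ALEKS homework weighting described in the text.

    objective_scores: fraction of topics completed in each weekly objective (0-1).
    pie_fraction: fraction of the pie filled by the end of the semester (0-1).
    Filling at least 85% of the pie earns full credit on the pie component;
    linear scaling below that threshold is an assumption for illustration.
    """
    objective_avg = sum(objective_scores) / len(objective_scores)
    pie_score = min(pie_fraction / 0.85, 1.0)  # 85% of the pie -> 100% pie credit
    return 0.5 * objective_avg + 0.5 * pie_score

# A student who averaged 90% on weekly objectives and filled 85% of the pie:
print(round(homework_grade([0.9, 0.9, 0.9], 0.85), 2))  # 0.95
```

Under this scheme, strong weekly work cannot fully compensate for an unfilled pie (and vice versa), which matches the intent of rewarding both steady progress and cumulative retention.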
The instructor felt it was reasonable to ask the students to complete most of the pie for full credit on "filling the pie". Periodically, and at the instructor's discretion, ALEKS deployed shorter knowledge checks that assessed mastery of topics previously learned. If a student had forgotten a previously learned topic, it was removed from their pie. The removal, however, was not meant to be permanent: students could earn back these topics with future knowledge checks and/or by working during "open pie" periods assigned by the instructor. Open pie periods allowed students to work forward and backward within ALEKS with the sole purpose of filling in the pie. In all cases, ALEKS would not allow a student to work on a topic they were not ready to learn. For example, if a student had not mastered balancing equations, the system would not allow the student to work on stoichiometry problems that required balancing first. Tailoring the learning experience for each student within the ALEKS courseware allowed for an adaptive and individualized learning path.
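The two behaviors described above — mastery after three consecutive correct answers, and prerequisite gating between topics — can be sketched in a simplified model. The topic names, function names, and prerequisite map below are invented for illustration; ALEKS's actual knowledge-space algorithm is proprietary and far more sophisticated.

```python
# Simplified model of two ALEKS behaviors described in the text:
# (1) a topic counts as "learned" after three correct answers in a row,
#     with a wrong answer only resetting the streak, and
# (2) a topic cannot be attempted until its prerequisites are mastered.

prereqs = {
    "balancing equations": [],
    "stoichiometry": ["balancing equations"],  # must balance before stoichiometry
}

def is_available(topic, mastered):
    """A topic is open only when all of its prerequisites are mastered."""
    return all(p in mastered for p in prereqs[topic])

def practice(answers):
    """Return True once three consecutive answers are correct."""
    streak = 0
    for correct in answers:
        streak = streak + 1 if correct else 0  # a wrong answer resets the streak
        if streak == 3:
            return True
    return False

mastered = set()
assert not is_available("stoichiometry", mastered)   # gated at first
if practice([True, False, True, True, True]):        # streak survives one miss
    mastered.add("balancing equations")
assert is_available("stoichiometry", mastered)       # now unlocked
```

The gating is what prevents, for example, a student from attempting a stoichiometry problem before demonstrating they can balance the underlying equation.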

Comparison of Learning

As described above, we examined student learning and experience with respect to the homework module over the course of the fall 2017 semester. This pilot study was not intended to be comprehensive or statistically thorough, especially given the small sample sizes, lack of demographic information, and inconsistent data collection by individual participating faculty. However, much can be learned about the impact on students and their learning path by examining a few key components of their experience. Furthermore, as our institution strives to improve student success in 100-level courses, experimenting with new learning modalities and pedagogical approaches is key.

Pre- and Post-Quiz

Students were given a pre-quiz at the start of the semester and the same post-quiz at the end of the semester, with six content questions as well as five questions about their learning (see "Student Opinion Surveys", below). The rate of success for four selected content questions is shown in Figure 2. Surveys were not distributed in all sections, due to instructor error; the data collected reflect information gathered from the ALEKS group (n = 28) and five sections of CHM 134 (n = 142). Although only around 30-35% of each population completed both the pre- and post-survey (n = 10 ALEKS, n = 41 non-ALEKS), the information collected was useful for assessing the viability of using ALEKS for homework.

Figure 2. Performance on chemistry content questions at the start (pre) and at the end of the semester (post). The ALEKS group is in pink/red (n = 10) while the non-ALEKS group is in blue (n = 41).

Table 3. The rate of success on common exam questions in the ALEKS and non-ALEKS sections. The sample size is reported to the right of each success rate. The large variations in sample size for the non-ALEKS group are due to instructor error, as described in the text.

Question                   ALEKS          Non-ALEKS
Q1: Moles                  45% (n = 28)   50% (n = 231)
Q2: Stoichiometry          80% (n = 25)   43% (n = 229)
Q3: Aqueous reactions      89% (n = 27)   75% (n = 127)
Q4: Frequency and E        88% (n = 26)   89% (n = 179)
Q5: Quantum numbers        92% (n = 26)   82% (n = 182)
Q6: Bonding                78% (n = 18)   62% (n = 124)

Figure 2 illustrates that the type of homework did not have an impact on student performance in every area of chemistry. For the topics of significant digits and aqueous reactions, an increase in student understanding was found from the beginning to the end of the semester for both groups. The fourth content area shown in Figure 2 is "identify molecule". When given a choice among a molecule, an atom, or an ion, students in the ALEKS group could not identify the correct answer at the start of the semester, but these students improved by the end of the semester. Although not significant, it is interesting to note that the non-ALEKS group did not increase their knowledge in this area; the number of correct responses actually saw a slight decline by the end of the semester.

Perhaps the most interesting finding is in the third question reported in Figure 2, regarding balancing equations. The question on the quiz was not a simple one, although it was multiple choice. The ALEKS group made a sizeable improvement in their understanding of how to balance equations, while the non-ALEKS group stayed almost flat. It appears that ALEKS was able to support students in the particular area of balancing equations, perhaps in a way that the other homework system could not.

Common Exam Questions

Over the course of the semester four exams were given in CHM 134. Each exam had one or two multiple choice questions that were common among all sections of the course (six questions in total). The rate of success on these questions (correct vs. incorrect) was compared between the ALEKS and non-ALEKS sections and is summarized in Table 3. As with any assessment, not all students attempted each exam or answered the given question. Additionally, not all participating faculty remembered to place every question on their exams, which led to a variation (and reduction) in sample size as the semester progressed.
Therefore, the student sampling varied from question to question; sample sizes are given for each question in parentheses. When evaluating the performance of the two groups reported in Table 3, there appears to be no difference in success on the fourth question, about frequency and energy, which could be categorized as a "plug and chug" problem requiring very little critical analysis to complete correctly. In four of the five remaining questions (Q2, Q3, Q5, Q6), the ALEKS group had a higher percentage of correct answers. The most noticeable difference is in the second question, a stoichiometry problem, on which students in the ALEKS group performed considerably better than the non-ALEKS group. It is interesting to note that this particular problem involved balancing an equation first, then completing a mole-to-mole conversion between reactants and products. The high level of success that the ALEKS students had on this specific question is consistent with the pre- and post-quiz results pertaining to balancing equations, previously discussed in Figure 2; in that analysis, the ALEKS students had much higher success with balancing equations than the non-ALEKS group.

Student Opinion Surveys

In addition to the six content questions that students completed during the pre- and post-quiz (Figure 2), they were also asked five questions about their learning and comfort level with math and chemistry. The average level of agreement is reported for four of these questions in Figure 3 (math skills not included). For the questions related to student confidence (i.e., Q1 and Q4), all students reported an increased level of agreement by the end of the semester. Both ALEKS and non-ALEKS populations were examined independently for differences in pre- and post-responses using an F-test. Although some questions revealed p-values less than 0.05, suggesting a difference in response from pre to post, the F-test indicated a large amount of variance in the sampling, and thus the perceived difference was not considered significant. The only opinion that was significantly different between the pre-test and post-test was Q4, regarding confidence in problem solving: for the ALEKS population only, both the p-value (p = 0.023) and the F-test indicated a significant difference, which is noted in Figure 3. Although no significant difference in opinion was found for Q1-Q3, the amount of growth in chemistry confidence for the ALEKS students appears higher than for the non-ALEKS group. Specifically, for the level of comfort explaining chemistry to others (Q1), the ALEKS group reported more than double the gains of the non-ALEKS group. A trend is clear for chemistry confidence: ALEKS students experienced more improvement in confidence than their non-ALEKS peers.

Figure 3. Self-reported survey responses on a scale of (1) strongly disagree to (7) strongly agree. The ALEKS group is depicted in pink/red (n = 10) and the non-ALEKS group in blue (n = 41). The error bars indicate the standard error of the mean, and the difference in pre-post opinion is noted by the delta values to the right of the bars. For Q4, the ALEKS population's response changed significantly.

In an effort to learn more about how students approach being in control of their learning, we asked them to rate their level of autonomy in their learning. Although the growth in this category was not as pronounced as the growth in student confidence, the ALEKS users reported a higher level of autonomy growth (Δ = 0.30) compared to no change at all in the non-ALEKS group (Δ = 0.0). Finally, we asked a question about the usefulness of homework. Students were asked in the pre-survey whether they believed homework would benefit their learning and problem-solving in the course. After the semester was nearly complete (in the post-assessment), the students were asked whether the homework did help them with learning and problem-solving. It was surprising to see that both groups of students, by the end of the semester, felt the homework did not help them as much as they had predicted it would. Although the ALEKS group had a slightly decreased perception of homework value (Δ = -0.10), the non-ALEKS group experienced a more substantial decrease (Δ = -0.80).
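The two quantities reported throughout this survey analysis — the pre-to-post shift in mean agreement (the Δ values) and the spread of responses that motivated the variance check — can be sketched with standard-library Python. The Likert responses below are illustrative only; the actual survey data are not reproduced here, and the function name is hypothetical.

```python
from statistics import mean, variance

def pre_post_summary(pre, post):
    """Return the mean pre-to-post shift (delta) and the ratio of
    sample variances, the two quantities discussed in the text."""
    delta = mean(post) - mean(pre)
    var_ratio = variance(post) / variance(pre)  # F-style ratio of variances
    return delta, var_ratio

# Hypothetical 7-point Likert responses (1 = strongly disagree, 7 = strongly agree)
pre = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]
post = [5, 6, 4, 6, 5, 5, 4, 5, 6, 5]
delta, var_ratio = pre_post_summary(pre, post)
print(f"delta = {delta:.2f}, variance ratio = {var_ratio:.2f}")
```

A large variance ratio (or very large variances in both samples relative to the shift) is the kind of signal that, as described above, led the authors to discount apparent pre/post differences despite nominally small p-values.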

Instructor Observations

At the launch of this pilot study, the intent was to utilize an adaptive learning tool to help support all students on their learning path in general chemistry. Exactly how this objective would be reached through that adaptive tool was unknown. What came to light, both through the student opinion surveys (Figure 3) and instructor observations, was an impact on student confidence. Over the duration of this pilot study, the ALEKS instructor was able to gain valuable qualitative feedback from students through informal interactions. A few student comments are highlighted here:

"I have spent a significant amount of time using ALEKS…it seems to teach from the book, asks questions that are relevant to the exams, and builds my confidence in my knowledge of the course materials…If you have trouble on a question the program will switch you to another, easier problem, forcing you to take a break from the frustration, rebuilding your confidence on another topic." —Student 1, non-traditional student

"While waiting for the tutor I noticed a kid from another class working on [non-ALEKS online homework] and I told him I could help him…and I did!" —Student 2, self-proclaimed "bad at chemistry" student

Both students quoted above were female biology majors enrolled in the ALEKS group. Student 1 also had experience with the non-ALEKS digital homework for chemistry, so she knew what other homework options were available. Student 2 came to the course declaring her historical inability to "do" chemistry and her dislike of it. When she recounted the story quoted above, she was beaming with excitement. Student 2 finished the course strong and was proud of what she had accomplished in the class. The growth of these two students was substantial. As an adaptive technology, ALEKS has the ability to support all students in their learning, in a personalized way.
Female students and underrepresented minorities face a belief gap (7) and tend to feel as if they might not belong in science courses or science majors, adding to an erosion in confidence. To help close this gap, it is important for these populations of students to feel supported and confident that they can achieve success in foundational science courses such as general chemistry.

Final Grade Distribution

At the conclusion of the fall 2017 semester, grade information was collected and compared between the ALEKS group (n = 28) and the non-ALEKS group (n = 243). The total number of students at the end of the semester (n = 271) is slightly lower than at the start (n = 279, Figure 1) due to college withdrawals. The distribution of course grades is shown in Figure 4. At York College of Pennsylvania, a C is the minimum grade required to earn credit for course completion within a particular degree sequence. The grade of D allows the student to earn institutional credit for the course, but it will not count toward degree completion, nor will a D grade allow the student to move to the next course in the sequence. The DFW rate as recorded at the institutional level therefore reflects the total number of students who are not successful in a course in a given semester. One might argue that a "W" is not a measure of failure; rather, it indicates a student making an informed decision to avoid earning a D or F in a course, perhaps after changing majors, for example. Therefore, in Figure 4 the DFW grades are not grouped together but instead are shown with W and D/F reported separately.


Figure 4. Final grade distribution for the ALEKS (n = 28) and non-ALEKS (n = 243) groups, fall 2017.

One notable difference between the ALEKS group and the non-ALEKS group is the percentage of high grades earned by the ALEKS students (50%). In addition to the higher rate of A/A-/B+ grades, the ALEKS students experienced a lower rate of D and F grades than the non-ALEKS students (7% and 20%, respectively). The shift toward higher grades could possibly be attributed to the slightly higher percentage of engineering majors in the ALEKS group (Table 2), as they historically tend to earn higher grades in chemistry compared to their biology peers at YCP. However, the ALEKS group did include a number of "other" majors such as fine art and psychology, which suggests that the ALEKS population was approximately as heterogeneous as the remaining general chemistry population. In fact, when compared to other sections with a high engineering population (i.e., >50% engineers), the non-ALEKS group as a whole had nearly the same grade distribution as the non-ALEKS engineering sections (data not shown). Finally, the ALEKS group did have a higher withdrawal percentage (7%) than the non-ALEKS group (4%). However, the ALEKS withdrawals comprised two students: one was changing majors to a non-science focus, and the other had experienced health issues, fell behind early on, and decided to take the class again to get a fresh start. Withdrawal was a viable and smart option for these students when faced with a choice between a failing grade and a W.

Student Experience: Comparison between ALEKS and Non-ALEKS Homework

General Chemistry II (CHM 136) is not a course that engineering majors take at YCP, and thus the enrollment for the second semester of general chemistry is typically much lower than enrollment in CHM 134. For the spring 2018 semester, which followed the semester of this pilot study, 75 students enrolled in CHM 136, 57% of whom were biology majors.
At the conclusion of the pilot study at the end of the fall 2017 semester, a faculty member who had not used ALEKS during the pilot offered to work with ALEKS in the spring 2018 semester in General Chemistry II. The majority of the students in this CHM 136-ALEKS group were, in fact, students who had not previously used ALEKS. At the conclusion of the spring 2018 semester, these students, who had experienced both ALEKS and non-ALEKS digital homework, were asked to complete a survey administered through the learning management system (n = 34). The students were asked whether ALEKS or non-ALEKS (or neither) was "better" at a particular task (e.g., preparing me for exams). As reported in Figure 5, students completing the survey (n = 27) clearly preferred ALEKS for their homework; they appreciated the feedback, the help with preparing for exams, and felt that ALEKS contributed more to their learning.


Figure 5. Student experiences with ALEKS and non-ALEKS online homework (n=27).

Limitations and Future Directions

After this work was presented at the symposium for which this book is compiled, we found that a similar study (29) had been published previously. In that study, a non-adaptive homework system was compared against ALEKS in general chemistry; however, the work focused largely on student performance related to a pre-course assignment. The authors concluded that additional studies are needed to evaluate the impact of an adaptive learning system on student learning. More findings in the area of adaptive learning effectiveness in chemistry are beginning to surface (30, 31), but the majority of these studies involve large student populations (i.e., large lecture sections), and the impact of ALEKS on smaller classes has yet to be established. Moreover, beneficial information can be gained from instructors using adaptive technologies in gateway courses.

Some limitations of this study are the small sample sizes and the lack of demographic information. These limitations, however, did not prevent us from gaining meaningful quantitative and qualitative information about an adaptive learning platform for supporting students in general chemistry. Collectively, the results from this pilot study were presented to the faculty in the chemistry program, resulting in a unanimous decision to move forward with ALEKS for the fall 2018 semester. As a program, we are extremely hopeful that the implementation of adaptive learning through ALEKS will provide students with an individualized learning path to support their success in general chemistry.

Conclusions

This pilot study was initiated to gain insight into whether an adaptive learning system would impact student learning in a gateway science course (CHM 134: General Chemistry I). In gathering and analyzing the data, it became clear that ALEKS would become the superior method of completing homework in our general chemistry course. As discussed throughout this chapter, the ALEKS group performed as well as or better than the non-ALEKS group on common exam questions and had improved content knowledge by the end of the semester. Furthermore, these students experienced significant gains in confidence with respect to understanding and explaining chemistry. Increased confidence in these students is likely correlated with increased success in this gateway science course, especially for female students. Those students who continued to the second semester of general chemistry reported a strong preference for ALEKS as a learning tool and appreciated many of the features of the system.

References

1. PCAST. Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics; PCAST: Washington, DC, 2012.
2. Shedlosky-Shoemaker, R.; Fautch, J. M. Who Leaves, Who Stays? Psychological Predictors of Undergraduate Chemistry Students' Persistence. J. Chem. Educ. 2015, 92, 408–414.
3. Chen, X. STEM Attrition: College Students' Paths Into and Out of STEM Fields; NCES 2014-001; National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education: Washington, DC, 2013.
4. Burtner, J. The Use of Discriminant Analysis to Investigate the Influence of Non-Cognitive Factors on Engineering School Persistence. J. Eng. Educ. 2005, 94, 335–338.
5. Huang, G.; Taddese, N.; Walter, E. Entry and Persistence of Women and Minorities in College Science and Engineering Education; NCES 2000-601; National Center for Education Statistics, U.S. Department of Education: Washington, DC, 2000.
6. Seymour, E.; Hewitt, N. M. Talking About Leaving: Why Undergraduates Leave the Sciences; Westview Press: Boulder, CO, 1997.
7. White, E. S. State of STEM: Defining the Landscape to Determine High-Impact Pathways for the Future Workforce; STEMconnector: Washington, DC, 2018.
8. Prince, M. Does Active Learning Work? A Review of the Research. J. Eng. Educ. 2004, 93, 223–231.
9. Michael, J. Where's the Evidence That Active Learning Works? Adv. Physiol. Educ. 2006, 30, 159–167.
10. Freeman, S.; Eddy, S. L.; McDonough, M.; Smith, M. K.; Okoroafor, N.; Jordt, H.; Wenderoth, M. P. Active Learning Increases Student Performance in Science, Engineering, and Mathematics. Proc. Natl. Acad. Sci. USA 2014, 111, 8410–8415.
11. Crouch, C. H.; Mazur, E. Peer Instruction: Ten Years of Experience and Results. Am. J. Phys. 2001, 69, 970–977.
12. Davidson, N.; Major, C. H.; Michaelsen, L. K. Small-Group Learning in Higher Education—Cooperative, Collaborative, Problem-Based, and Team-Based Learning: An Introduction by the Guest Editors. J. Excellence Coll. Teach. 2014, 25, 1–6.
13. Smith, J. D. Student Attitudes toward Flipping the General Chemistry Classroom. Chem. Educ. Res. Pract. 2013, 14, 607–614.
14. Fautch, J. M. The Flipped Classroom for Teaching Organic Chemistry in Small Classes: Is It Effective? Chem. Educ. Res. Pract. 2015, 16, 179–186.
15. Deri, M. A.; McGregor, D.; Mills, P. Using Technology To Flip and Structure General Chemistry Courses at a Large Public University: Our Approach, Experience, and Outcomes. In Teaching and the Internet: The Application of Web Apps, Networking, and Online Tech for Chemistry Education; ACS Symposium Series 1270; American Chemical Society: Washington, DC, 2017; pp 75–97.
16. Graham, M. J.; Frederick, J.; Byars-Winston, A.; Hunter, A.-B.; Handelsman, J. Increasing Persistence of College Students in STEM. Science 2013, 341, 1455–1456.
17. Inkelas, K. K.; Daver, Z. E.; Vogt, K. E.; Leonard, J. B. Living-Learning Programs and First-Generation Students' Academic and Social Transition to College. Res. Higher Educ. 2007, 48, 403–434.
18. Kuh, G. D. High-Impact Educational Practices: What They Are, Who Has Access to Them, and Why They Matter; American Association of Colleges and Universities (AAC&U): Washington, DC, 2008. https://www.aacu.org/leap/hips.
19. Rocconi, L. M. The Impact of Learning Communities on First Year Students' Growth and Development in College. Res. High. Educ. 2010, 52, 178–193.
20. Hutson, B. L. The Impact of an Appreciative Advising-Based University Studies Course on College Student First-Year Experience. J. Appl. Res. Higher Educ. 2010, 2, 4–13.
21. Kendricks, K. D.; Nedunuri, K. V.; Arment, A. R. Minority Student Perceptions of the Impact of Mentoring to Enhance Academic Performance in STEM Disciplines. J. STEM Educ. 2013, 14, 38–46.
22. Nagda, B. A.; Gregerman, S. R.; Jonides, J.; Von Hippel, W.; Lerner, J. S. Undergraduate Student-Faculty Research Partnerships Affect Student Retention. Rev. Higher Educ. 1998, 22, 55–72.
23. Lopatto, D. Undergraduate Research Experiences Support Science Career Decisions and Active Learning. CBE Life Sci. Educ. 2007, 6, 297–306.
24. Russell, S. H.; Hancock, M. P.; McCullough, J. The Pipeline: Benefits of Undergraduate Research Experiences. Science 2007, 316, 548–549.
25. Handelsman, J. Education: Scientific Teaching. Science 2004, 304, 521–522.
26. Liu, M.; Kang, J.; Zou, W.; Hyeyeon, L.; Pan, Z.; Corliss, S. Using Data to Understand How to Better Design Adaptive Learning. Tech. Know. Learn. 2017, 22, 271–298.
27. Bryant, G. Learning to Adapt 2.0: The Evolution of Adaptive Learning in Higher Education; Tyton Partners: Boston, MA, 2016.
28. What is ALEKS? McGraw Hill Education. https://www.aleks.com/about_aleks (accessed Feb 26, 2019).
29. Eichler, J. F.; Peeples, J. Online Homework Put to the Test: A Report on the Impact of Two Online Learning Systems on Student Performance in General Chemistry. J. Chem. Educ. 2013, 90, 1137–1143.
30. Smith, P. Personalized Education Using Adaptive Learning Technology: One Size Doesn't Have to Fit All. Learning and Teaching in Action 2016, 12, 101–118. http://www.celt.mmu.ac.uk/ltia/Vol12Iss1 (accessed Feb 26, 2019).
31. Richards-Babb, M.; Curtis, R.; Ratcliff, B.; Roy, A.; Mikalik, T. General Chemistry Student Attitudes and Success with Use of Online Homework: Traditional-Responsive versus Adaptive-Responsive. J. Chem. Educ. 2018, 95, 691–699.

Chapter 7

Introducing Components of Specifications Grading to a General Chemistry I Course

Langdon J. Martin*

Department of Chemistry and Physics, Warren Wilson College, P.O. Box 9000, Asheville, North Carolina 28815, United States

*E-mail: [email protected].

A critical component in writing a syllabus is defining how final grades reflect student achievement. A method of grading described in Linda B. Nilson's book Specifications Grading focuses not on points that a student earns, but on milestones or competencies that a student achieves. This method also improves grading efficiency by reducing the need to assign partial credit for incomplete or incorrect work. This method has been implemented in multiple sections of General Chemistry I. First, the course is divided into discrete units that are designated "Essential" or "Ordinary." For each unit, students take a quiz that is graded Mastery/No Credit. Students must demonstrate Mastery on all essential units in order to pass the course, and their final grades are determined, in part, by the number of ordinary units for which they achieve Mastery. Importantly, students who earn a No Credit grade on a quiz are expected to attempt an alternate version of the quiz at a later date. This system provides motivation to revisit material that is not mastered, and encourages students to think critically about what they did and did not learn. The benefits and drawbacks of the specifications grading system as part of General Chemistry I are discussed here, including aspects of logistics, grading, and student success.

Introduction: Specifications Grading

General Chemistry students comprise a variety of majors and career goals. Regardless of how course topics are curated and organized, it is each student's choice of when and how to engage with the material that affects how well that individual learns—and retains—that material. When assembling a course, it is important to incentivize students to engage with the material in a way that promotes deep learning and long-term retention. Specifically, it is best for students to spend more time learning chemistry on a regular, daily basis, instead of "cramming" before an exam. A further, crucial, part of incentivizing regular study is teaching students to engage with chemistry in a metacognitive way: students should ask themselves how well they know something.

© 2019 American Chemical Society

When students can identify what they do not grasp, they can more efficiently allocate their study time. A system of course design called specifications grading includes two key components to incentivize student success: the abandonment of partial credit and the requirement that students revisit unsuccessful work. By not offering partial credit, the instructor classifies all incorrect answers as simply "not correct"—and therefore worthy of revisiting. Students who revisit their preliminary missteps have an opportunity to deeply learn and retain the course topics. This chapter includes an overview of some of the main practices in specifications grading, particularly how specifications grading has been implemented in small (24–35 student) sections of General Chemistry I at Warren Wilson College.

The Downsides of Partial Credit

One of the most frustrating activities for instructors is the assignment of partial credit. It is typically the most time-consuming part of grading quizzes and exams. There is the difficulty of differentiating a variety of "half-correct" answers from "three-quarters correct" and "one-third correct" answers. There is the pressure to be fair—or at least consistent—when assigning partial credit…and there may be added stress from doing the grading at odd hours. But perhaps the single most disheartening thought around partial credit is that many students will not use the information it contains as a tool for deep learning. Too often, the assignment of partial credit feels like rewarding—and therefore reinforcing—mediocrity. In the traditional manner of exams, students often see no reason to attempt to figure out why they received partial credit. (The major exception being the cases when the student wants to argue that they should have been awarded slightly more partial credit!)
For students to deeply learn the material, they must figure out where they went wrong and then attempt to adjust their understanding so that they avoid making the same mistake in the future. The specifications grading system is designed to reward mastery, and to demand that students face their misconceptions and work to correct them.

The Specifications Grading Philosophy

The book Specifications Grading, by Linda B. Nilson, details this practice across subjects (1). I was first introduced to the idea of specifications grading in Chemistry at the 2016 Biennial Conference on Chemical Education in Greeley, CO, at a talk given by Prof. Joshua Ring of Lenoir–Rhyne University (2). This system has also been described by Prof. W. Christopher Hollinsed of Howard University (3). Similar grading systems have been proposed by others (4–6). The structures of traditional and specifications-based courses both fit well within the philosophy of backward design (7). In this regard, the transition is straightforward. The main difference between a traditionally graded course and a course that uses specifications grading is the type of assessment(s) by which students earn the majority of their final grade. Key differences between the traditional and specifications systems of grading are described in Table 1. In courses that are graded in a traditional manner, exams are the most important student assessment. For example, a traditional General Chemistry I course could include five midterm exams and a comprehensive final. But even with cumulative exams, students may have minimal incentive to revisit the questions on which they previously received partial or no credit.


Table 1. Comparison of Traditional Grading to Specifications Grading

Familiarity
  Traditional: Students are familiar with the system.
  Specifications: Students require additional explanation of why this system is being implemented and how to approach it.

Course Structure and Summative Assessment
  Traditional: Course is divided into Objectives. Summative assessment of classroom work is primarily through large exams that comprise multiple objectives, given at regular intervals throughout the course.
  Specifications: Course is divided into Objectives. Summative assessment of classroom work is primarily through high-stakes quizzes that comprise a single objective, given soon after the objective has been covered.

Grading
  Traditional: Partial credit is awarded at the instructor's discretion. Students have a single attempt at an exam.
  Specifications: Partial credit is not awarded on the high-stakes quizzes; students only receive a passing grade on a quiz when they achieve a high threshold (e.g., 80%). Students have the opportunity to try a new version of quizzes for which they have not yet reached the threshold.

Final Grades
  Traditional: Final grades are calculated based on the (weighted) percentages of points students earn.
  Specifications: Final grades are calculated based on the number of objectives for which students have demonstrated Mastery.

In a course with specifications grading, small units (typically comprising the material covered in one or two classes) give rise to high-stakes Unit Quizzes. Each quiz is graded without partial credit using a pass/fail-styled system. Midterm(s) and a Final Exam are still part of this system, but the single biggest factor in a student's final grade is the number of Unit Quizzes on which the student demonstrates mastery, meaning they must revisit any unit on which they do not reach a certain threshold. One of the biggest appeals of specifications grading is the intentional abandonment of partial credit. When a student does not answer a problem correctly and completely, the best explanation is often that the student has not mastered the material. And when one has not mastered the material, one should spend more time with the material, focusing on what one does not yet know, until mastery is achieved. With a clear definition of mastery and the opportunity to learn from unsuccessful attempts, students have a grade-based incentive to think deeply about their progress on each unit. What did they get correct on their first attempt, and what did they miss? What adjustments do they need to make? What practice do they need to achieve mastery? The specifications grading pedagogy codifies a level of standards that encourages this type of metacognitive thinking and deep learning that can lead to long-term retention of material.

The High-Stakes Unit Quiz

The Unit Quizzes in a specifications grading system come with higher stakes than what a student might normally imagine upon hearing the word "quiz." For this reason, some instructors may refer to them some other way; Ring used "quest," a portmanteau of quiz and test (2). There are five key tenets to designing the Unit Quiz assessments.

Unit Quizzes Must Be Short

Unit Quizzes are given at the beginning of class, often every class period. Therefore, they must not take up too much time. Students are told that quizzes are designed to be finished in less than ten minutes, but they will have fifteen minutes to complete them. Some days it is necessary to use the 15-minute mark as a cut-off, but some days everyone finishes and class can begin sooner.

Unit Quizzes Must Be Some Version of Pass/Fail

One option is to grade Unit Quizzes "Mastery" or "No Credit." The use of "no credit," as opposed to "fail," both avoids the stigma that students often attach to the latter and emphasizes the formative information that comes from the quiz: many, sometimes most, students need to make a second attempt on any given Unit Quiz, and "fail" implies a finality that is not the case. Potentially, a student could require two attempts for every Unit Quiz and still earn an A in the course.

The Bar to Demonstrate Mastery on a Unit Quiz Should Be High

A grade of Mastery is awarded when four of five questions are correct (typically, quizzes are one side of one page with five questions). Furthermore, each of the five questions must be sufficiently challenging. A multi-step dimensional analysis problem is generally sufficient for a quiz question, but fill-in-the-blank (or fill-in-the-table) styled problems should include multiple components, all of which must be correct.

No Partial Credit Is Awarded on Unit Quizzes

This is key: each question is either right or wrong. For the most part, this significantly expedites grading. To be sure, there are still judgment calls. For example, one might choose to overlook a calculator error (or a significant-figure error) if a student has clearly set up the solution correctly. Ideally, graded quizzes are returned to students before the day is out to underscore their use for formative learning.
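The all-or-nothing grading rule described above (Mastery when at least four of five questions are fully correct, an 80% threshold) can be sketched as a short function. This is a minimal illustration only; the function and parameter names are my own, not from the course materials.

```python
# Minimal sketch of the pass/fail Unit Quiz grading described above.
# Each question is simply right or wrong; "Mastery" requires meeting
# a high threshold (e.g., 4 of 5 questions correct).

def grade_unit_quiz(questions_correct, total_questions=5, threshold=0.80):
    """Return 'Mastery' or 'No Credit' for one Unit Quiz attempt."""
    if questions_correct / total_questions >= threshold:
        return "Mastery"
    return "No Credit"
```

For example, `grade_unit_quiz(4)` yields "Mastery" (4/5 meets the 80% bar), while `grade_unit_quiz(3)` yields "No Credit" and flags the unit for a retake.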
There Are Multiple Opportunities for a Student to Demonstrate Mastery on a Unit

Most students do not achieve Mastery on their first attempts, particularly in the first few weeks of the semester as they settle into the course (and an unfamiliar grading system). Thus, students have an additional opportunity, or opportunities, to study, review, meet with tutors, or talk to the instructor, and make a fresh attempt at demonstrating Mastery on each Unit Quiz. Herein lies a significant downside for the instructor: multiple quizzes must be composed for each unit, and which version is given, and when, and to whom, must be tracked meticulously. The instructor will eventually develop quite a library of quizzes over the course of teaching this way for multiple semesters, and this represents a significant investment of time and mental energy. A major payoff, though, is that this iterative retaking is a wonderful opportunity to teach students metacognitive strategies. During office hours, the instructor can ask a student which of the specific objectives from the unit correspond to the questions that she got wrong, and then discuss how to focus her energy on learning those specific goals. Most students do not earn Mastery on every unit, even after multiple attempts. Another benefit of this grading system is that it provides a sort of metacognitive guidance to the students who earn B's and C's: when students realize that they will not earn Mastery on every unit, they can focus their study first on the Essential units, and then on units in which they have made good progress toward mastery. In this way, every student who passes the course has reached a high level of achievement on a subset of the course learning goals. Ideally, this deeper level of understanding will translate into longer-term retention of their chemical knowledge and abilities. As an instructor, this outcome is far more satisfying than awarding a passing grade to someone who may have eked by on partial credit earned through a heavy reliance on short-term memory.

The Specifications Gradebook and Syllabus

The gradebook of an instructor using the specifications grading philosophy looks quite different than that of an instructor using the traditional system. Rather than an array of scores and percentages, there is an array of 1's (which represent Masteries achieved) and 0's. Final grades are calculated not as a weighted average but as a sum: the more 1's (that is, the more demonstrations of Mastery), the higher the final grade. Another benefit to the instructor is that, as long as student quiz scores are kept current on a Learning Management System such as Moodle, students need not pester the instructor as to what they should study in order to improve their grade. An additional benefit of the use of high-stakes Unit Quizzes is that certain units can be designated as "Essential": failure to demonstrate Mastery on an Essential Unit will result in automatic failure of the course. Because of the scaffolded nature of General Chemistry, most of these Essential Units occur in the first month or so of the course. This can, of course, cause quite a bit of stress for students in the first few weeks, but they quickly realize the necessity of regular studying and staying on top of the material at hand. Many students who earn an F in General Chemistry do poorly early on but believe themselves capable of doing well late in the course to bring their grade up.
Because Essential Units are clearly specified in the syllabus, it is much easier to make clear to failing students that they are better off taking a non-punitive Withdrawal than having an F on their transcript. Finally, an important feature of the specifications grading system is that it is not necessary to switch to it all at once: it is well suited to being hybridized with a traditional system of weighted averages. In fact, I recommend that instructors implement the system gradually, over the course of a few semesters, and see what the right balance is for them.
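The gradebook of 1's and 0's described in this section, with the final grade computed as a sum of Masteries and an all-or-nothing requirement on Essential units, could be sketched as follows. The unit names and letter-grade cutoffs below are deliberately small, hypothetical placeholders; the actual cutoffs varied by semester, as shown in Table 3.

```python
# Illustrative sketch of a specifications gradebook: each unit maps to 1
# (Mastery demonstrated) or 0 (not yet). The final grade is a sum of 1's,
# not a weighted average. Unit names and cutoffs here are hypothetical.

gradebook = {
    # Essential units: Mastery on ALL of these is required to pass.
    "elo": {"Measurement": 1, "Atomic Structure": 1, "The Mole": 1},
    # Ordinary units: the count of 1's determines the letter grade.
    "olo": {"Stoichiometry": 1, "Redox Chemistry": 0, "Thermochemistry": 1},
}

def final_grade(gradebook, cutoffs):
    """cutoffs: list of (min_olo_masteries, letter) pairs, highest first."""
    if not all(gradebook["elo"].values()):
        return "F"  # missing any Essential unit fails the course outright
    olo_count = sum(gradebook["olo"].values())
    for minimum, letter in cutoffs:
        if olo_count >= minimum:
            return letter
    return "F"

# Hypothetical cutoffs for this three-OLO toy course:
print(final_grade(gradebook, [(3, "A"), (2, "B"), (1, "C")]))  # prints B
```

With two of three Ordinary Masteries and all Essentials passed, the toy student above earns a B; flipping any Essential unit to 0 would produce an F regardless of the OLO count.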

Implementation: A Hybridized Course

Course Details

Methods of specifications grading have been implemented in General Chemistry I (CHM 1160) at Warren Wilson College, a private liberal arts college just outside of Asheville, NC, with an enrollment of about 600 undergraduates. Sections with specifications grading enrolled between 24 and 35 students, comprising students majoring in Chemistry, Biochemistry, Biology, Conservation Biology, and Environmental Studies. Specifications grading has been implemented in General Chemistry I for three semesters: Spring 2018, Fall 2018, and Spring 2019. Currently, hybridized components of specifications grading have been introduced into the Organic I and Organic II courses as well, although these are not discussed herein. This chapter focuses primarily on the initial implementations in General Chemistry I.


Hybridizing Specifications Grading with Traditional Grading

Defining Units

In Spring 2018, General Chemistry I was adapted to a hybridized traditional–specifications course. The critical first step was to divide existing course objectives into 18 distinct Units, shown in Table 2. The first five units were designated Essential (called Essential Learning Outcomes, or ELOs, on the syllabus); the remaining thirteen were designated Ordinary (thus, Ordinary Learning Outcomes, or OLOs, on the syllabus). The material in one unit typically encompassed one to two class periods. Differentiating between an Essential and an Ordinary unit was difficult, and designations may vary between instructors. One guideline is that Essential units include content that is foundational for multiple future units. Some instructors do not like the moniker "Ordinary," so something else, such as "General," may be used (3).

Scheduling Unit Quizzes

For each unit, students took a Unit Quiz at the beginning of a regular class period, on which they earned a grade of Mastery or No Credit. Unit Quizzes were either 5 or 10 questions. These were graded without partial credit, and a minimum score of 80% was required for Mastery. Students could take an Essential Unit Quiz up to four times, and an Ordinary Unit Quiz up to three times; each attempt would include new questions of equivalent difficulty. Managing quiz retakes is the most logistically challenging part of the specifications grading system. There should be at least three designated Retake Days built into the course, during which students must attempt any Unit Quizzes on which they have not demonstrated Mastery. Because a specifications grading-based course has fewer exams than a traditionally graded course, days that were previously used for exams can be re-designated in this way.
Other make-up times can be scheduled individually, such as during office hours, or built into the schedule in other ways; the size of the course will significantly factor into this decision. It is critical that students be made aware that they must schedule outside retakes well in advance.

Defining Final Grades

Prior to implementing specifications grading in General Chemistry I, a traditional system of weighted grading was used. Exam scores comprised roughly 50% of a student's final grade, laboratory work was another 25%, and the remaining 25% was earned through in-class quizzes, online homework, and participation. The syllabus included essential outcomes such as a minimum percentage in the laboratory and a maximum number of absences. Total percentages were translated into a letter grade, as shown in Table 3. For the Spring 2018 hybridized course, the number of units on which a student earned Mastery did not contribute directly to the student's final grade, but rather served as a grade ceiling: a student needed to achieve Mastery on a minimum number of Ordinary Unit Quizzes (and all of the Essential Unit Quizzes) to earn the appropriate grade, as in Table 3. Since that semester, the course has been adapted to a more complete specifications grading system, also shown in Table 3. The rightmost column in Table 3 includes the grading scale from the most recent (Spring 2019) syllabus. Other excerpts from this syllabus that are specific to the incorporation of specifications grading are included in Figure 1.
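In the Spring 2018 hybrid system, the Mastery count acted as a ceiling on the grade earned through weighted percentages. One way to sketch that interaction is below; the OLO thresholds loosely follow the hybrid column of Table 3, but the percentage scale is a simplified placeholder, and all names are my own.

```python
# Sketch of the Spring 2018 hybrid: a weighted percentage determines a
# provisional grade, but the count of OLO Masteries caps the final letter.
# The scales below are simplified illustrations, not the exact syllabus.

GRADES = ["F", "D", "C", "B", "A"]  # worst to best

def grade_from_percent(pct):
    """Simplified traditional weighted-average scale (placeholder values)."""
    if pct >= 90: return "A"
    if pct >= 80: return "B"
    if pct >= 70: return "C"
    if pct >= 60: return "D"
    return "F"

def grade_ceiling_from_olos(olo_masteries):
    """Ceiling from OLO Mastery count (loosely per Table 3's hybrid column)."""
    if olo_masteries >= 12: return "A"
    if olo_masteries >= 10: return "B"
    if olo_masteries >= 7:  return "C"
    if olo_masteries >= 4:  return "D"
    return "F"

def hybrid_grade(pct, olo_masteries, all_elos_passed):
    if not all_elos_passed:
        return "F"  # failing any Essential unit fails the course
    earned = grade_from_percent(pct)
    ceiling = grade_ceiling_from_olos(olo_masteries)
    # The lower of the two letters wins: Masteries cap the earned grade.
    return min(earned, ceiling, key=GRADES.index)
```

For instance, a student with a 93% weighted average but only 8 OLO Masteries would be capped at a C in this sketch, which is exactly the incentive the grade ceiling is meant to create.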

Table 2. General Chemistry I Units for Specifications Grading 1

Spring 2018 (18 units): Measurement and Dimensional Analysis; Atomic Structure; Ionic Compounds and Solubility Rules; The Mole; Solutions and Dilutions; Introduction to Intermolecular Forces; Introduction to Thermodynamics; Balancing Equations and Precipitation Reactions; Stoichiometry; Acid–Base Chemistry; Redox Chemistry; Thermochemistry; Electromagnetic Energy; Colorimetric Analysis; Atomic Orbitals and Periodicity; Covalent Bonding; Molecular Shape; Introduction to Organic Molecules and Reactions.

Fall 2018: Measurement; Dimensional Analysis; Atomic Structure; Ionic Compounds and Solubility Rules; The Mole; Solutions and Dilutions; Introduction to Intermolecular Forces; Introduction to Thermodynamics; Balancing Equations and Precipitation Reactions; Stoichiometry; Acid–Base Chemistry; Redox Chemistry; Thermochemistry; Electromagnetic Energy; Colorimetric Analysis; Atomic Orbitals and Periodicity; Covalent Bonding; Molecular Shape; Introduction to Organic Molecules; Introduction to Organic Reactions.

Spring 2019: Measurement; Dimensional Analysis; Atomic Structure; Monatomic Ionic Compounds; Polyatomic Ions and Solubility Rules; The Mole; Solutions and Dilutions; Introduction to Intermolecular Forces; Introduction to Thermodynamics; Balancing Equations; Precipitation Reactions; Stoichiometry; Acid–Base Chemistry; Redox Chemistry; Thermochemistry; Electromagnetic Energy; Colorimetric Analysis; Atomic Orbitals; Periodicity; Covalent Bonding; Molecular Shape; Introduction to Organic Molecules; Introduction to Organic Reactions.

1 Essential Learning Outcomes (ELOs) are shown in boldface; all others are Ordinary Learning Outcomes (OLOs).


Table 3. Defining Final Grades in General Chemistry I Using Traditional and Specifications Grading

Grade | 2016–2017: Traditional grading | Spring 2018: Hybrid specifications/traditional grading 1,2 | Fall 2018: Specifications grading 1,2 | Spring 2019: Specifications grading 1,2

A  | ≥ 92.5% | ≥ 92.5%                      | 25 or 26 OLOs | 29 or 30 OLOs
A– | ≥ 90.0% | ≥ 90.0% and ≥ 12 OLO quizzes | 23 or 24 OLOs | 26 to 28 OLOs
B+ | ≥ 87.5% | ≥ 87.5%                      | 21 or 22 OLOs | 24 or 25 OLOs
B  | ≥ 82.5% | ≥ 82.5%                      | 19 or 20 OLOs | 21 to 23 OLOs
B– | ≥ 80.0% | ≥ 80.0% and ≥ 10 OLO quizzes | 17 or 18 OLOs | 19 or 20 OLOs
C+ | ≥ 77.5% | ≥ 77.5%                      | 15 or 16 OLOs | 17 or 18 OLOs
C  | ≥ 72.5% | ≥ 72.5%                      | 13 or 14 OLOs | 14 to 16 OLOs
C– | ≥ 70.0% | ≥ 70.0% and ≥ 7 OLO quizzes  | 11 or 12 OLOs | 12 or 13 OLOs
D  | ≥ 60.0% | ≥ 60.0%, ≥ 4 OLO quizzes, and all 5 ELO quizzes | 7 to 10 OLOs and 12 ELOs | 7 to 11 OLOs and 12 ELOs

F