
REDESIGNING COURSES FOR ONLINE DELIVERY: DESIGN, INTERACTION, MEDIA, & EVALUATION

CUTTING-EDGE TECHNOLOGIES IN HIGHER EDUCATION
Series Editor: Charles Wankel

Recent Volumes:

Volume 5: Misbehavior Online in Higher Education. Edited by Laura A. Wankel and Charles Wankel
Volume 6A: Increasing Student Engagement and Retention using Online Learning Activities: Wikis, Blogs and Webquests. Edited by Charles Wankel and Patrick Blessinger
Volume 6B: Increasing Student Engagement and Retention Using Social Technologies: Facebook, E-Portfolios and Other Social Networking Services. Edited by Laura A. Wankel and Patrick Blessinger
Volume 6C: Increasing Student Engagement and Retention Using Immersive Interfaces: Virtual Worlds, Gaming, and Simulation. Edited by Charles Wankel and Patrick Blessinger
Volume 6D: Increasing Student Engagement and Retention Using Mobile Applications: Smartphones, Skype and Texting Technologies. Edited by Laura A. Wankel and Patrick Blessinger
Volume 6E: Increasing Student Engagement and Retention Using Classroom Technologies: Classroom Response Systems and Mediated Discourse Technologies. Edited by Charles Wankel and Patrick Blessinger
Volume 6F: Increasing Student Engagement and Retention Using Multimedia Technologies: Video Annotation, Multimedia Applications, Videoconferencing and Transmedia Storytelling. Edited by Laura A. Wankel and Patrick Blessinger
Volume 6G: Increasing Student Engagement and Retention in E-Learning Environments: Web 2.0 and Blended Learning Technologies. Edited by Charles Wankel and Patrick Blessinger
Volume 7: Digital Humanities: Current Perspectives, Practices and Research. By Bryan Carter

CUTTING-EDGE TECHNOLOGIES IN HIGHER EDUCATION VOLUME 8

REDESIGNING COURSES FOR ONLINE DELIVERY: DESIGN, INTERACTION, MEDIA, & EVALUATION

BY ROBYN E. PARKER
Plymouth State University, New Hampshire, USA

United Kingdom - North America - India - Malaysia - China - Japan

Emerald Group Publishing Limited
Howard House, Wagon Lane, Bingley BD16 1WA, UK

First edition 2013

Copyright © 2013 Emerald Group Publishing Limited

Reprints and permission service
Contact: [email protected]

No part of this book may be reproduced, stored in a retrieval system, transmitted in any form or by any means electronic, mechanical, photocopying, recording or otherwise without either the prior written permission of the publisher or a licence permitting restricted copying issued in the UK by The Copyright Licensing Agency and in the USA by The Copyright Clearance Center. Any opinions expressed in the chapters are those of the authors. Whilst Emerald makes every effort to ensure the quality and accuracy of its content, Emerald makes no representation, implied or otherwise, as to the chapters' suitability and application and disclaims any warranties, express or implied, to their use.

British Library Cataloguing in Publication Data
A catalogue record for this book is available from the British Library

ISBN: 978-1-78190-690-3
ISSN: 2044-9968 (Series)

ISOQAR certified Management System, awarded to Emerald for adherence to Environmental standard ISO 14001:2004. Certificate Number 1985.

CONTENTS

ACKNOWLEDGMENTS
LIST OF FIGURES
LIST OF TABLES

1. COURSE REDESIGN USING THE DIME MODEL
   Introduction
   Online teaching and learning differs from the classroom
   DIME model for course redesign
   Need for a systematic approach to redesign
   Instructor as principal player in course redesign
   Accepting the course redesign challenge
   Conclusion
   References

2. METAPHOR AS A FRAME FOR COURSE REDESIGN
   Introduction
   Authentic redesign examples
   Mental models are not always conscious
   Metaphor is a frame and an idea generator
   Revisiting DIME as metaphor
   The revised course redesign model
   Conclusion
   References

3. DESIGN CONSIDERATIONS
   Introduction
   Analyzing your target audience
   Determine objectives
   Plan instructional methods
   Conclusion
   References

4. INTERACTION CONSIDERATIONS
   Introduction
   Types of learner-centered interactions
   Metaphorical reflections of interaction
   Conclusion
   References

5. MEDIA CONSIDERATIONS
   Introduction
   Identifying media needs
   Finding media to meet needs
   Selecting media to fit previous redesign decisions
   Conclusion
   References

6. EVALUATION CONSIDERATIONS
   Introduction
   Evaluation in online courses
   Integrating evaluation of student learning
   Evaluating instructor effectiveness and course quality
   Conclusion
   References

7. REPRISE AND FURTHER CONSIDERATIONS
   Introduction
   Metaphor revisited
   Highlights of the DIME redesign model: a reprise
   Further considerations
   Conclusion
   References

ABOUT THE AUTHOR

ACKNOWLEDGMENTS

The writing of this book has been a team effort. It simply would not have been possible without the support of my university colleagues at Plymouth State University, who graciously picked up the service load to provide me with time to write. My thanks to Trent Boggess, dean of the College of Business Administration, for his instrumental assistance, in the form of graduate assistant Valer Suteu, and for his professional support. My gratitude also goes to Emerald Insight's Chris Hart, contracting editor, Sharon Parkinson, publisher, and Sarah Baxter, managing editor, as well as to Charles Wankel of St. John's University, series editor.

Thanks also go to those who have helped shape my thinking and helped me to develop a concept for course redesign that I'm excited to share. Redesigning Courses for Online Delivery: Design, Interaction, Media, & Evaluation is the product of years of reading, discussing, reflecting, and experimenting. Perhaps I am most indebted to my students, who freely shared with me when things worked and when they didn't.

There are several other people who were directly involved in this project. Thanks to my colleague Professor Deborah Brownstein for her invaluable insights on metaphor and design thinking. Heartfelt appreciation goes out to Sheryl Hansen, director, Ohio Board of Regents, eTech Ohio, for a dozen years of support and her guiding feedback on the DeSIGN concept. Appreciation goes out to longtime collaborator and colleague Professor Albert (Chip) Ingram of Kent State University. Chip's input to the interaction chapter was an organizing force, putting order to chaos.

Enduring gratitude to my loving family, without whom I simply wouldn't have completed this work. Thanks to my daughter Lauren, who used her newly minted degree in creative writing to keep her mother from mixing too many metaphors, Katherine, for her patience and understanding, and Meridith and Rachel, for their expressions of joy and support for this project. Finally, this project would not have happened without my loving husband and collaborative partner, Steven Bardus. Steve was graphic designer, page editor, table maker, and all-around go-to guy. Many, many thanks for his help in bringing this project to fruition.

LIST OF FIGURES

Fig. 1.1 Four-Phase Course Redesign Model: DIME
Fig. 2.1 Metaphor Reflects Your Views about the Content, Learners, and Instructor and How They Fit Together
Fig. 2.2 Course Redesign Process, Featuring Metaphor as the Lens through Which the Remaining Elements Are Viewed
Fig. 4.1 Nature of Online Learner-Instructor Interactions
Fig. 4.2 Playground Metaphor
Fig. 4.3 Baseball Metaphor
Fig. 4.4 Symphony Metaphor
Fig. 5.1 Course Diagram of Learning Activities: Playground Metaphor
Fig. 5.2 Course Diagram of Learning Activities: Symphony Metaphor

LIST OF TABLES

Table 2.1 Playground Metaphor
Table 2.2 Marketplace Metaphor
Table 2.3 Safari Metaphor
Table 2.4 Archeological Dig Metaphor
Table 3.1 Revising Learning Objectives for Strength and Clarity
Table 3.2 Plan Instructional Methods from Course Objectives
Table 4.1 Course Redesign Using the Playground Metaphor
Table 4.2 Course Redesign Using the Baseball Metaphor
Table 4.3 Course Redesign Using the Symphony Metaphor
Table 5.1 Media Enabled Course Activities
Table 5.2 Course Activities by Interaction Type and Media Characteristics
Table 5.3 Sample Tools to Enable Content-Interaction Activities
Table 5.4 Sample Tools to Enable Instructor-Interaction Related Activities
Table 5.5 Sample Tools to Enable Peer-Interaction Related Activities
Table 5.6 Revisiting Redesign Decisions for the Playground Metaphor
Table 5.7 Course Activities and Media Needs: Playground
Table 5.8 Revisiting Redesign Decisions for the Symphony Metaphor
Table 5.9 Course Activities and Media Needs: Symphony

CHAPTER 1

COURSE REDESIGN USING THE DIME MODEL

ABSTRACT

Course redesign follows a four-stage process organized around key sets of considerations related to design, interaction, media, and evaluation. In this chapter, we introduce the DIME model of course redesign, a systematic approach to creating and implementing online experiences. We argue that new mental models are needed to move away from simply digitizing the in-class experience for online delivery. Online teaching and learning is unique and requires new approaches. The model puts technology in a supporting role, privileging pedagogy and human interaction. The principal role of the instructor is explored.

Keywords: Course redesign; online teaching and learning; online pedagogy; instructional technology; online education; redesign model

INTRODUCTION

This volume is about change. Whether you are moving your class online because you find the prospect exciting, or simply because you've been asked to do so, you will need to rethink the way you teach. Successfully moving courses online involves more than digitizing what you're presently doing in face-to-face classes; teaching and learning online requires a different approach. But, you may be wondering, how do I get started?

All too often, we default to familiar methods, even when new ones are needed. Such was the case with a colleague I ran into as she hurried to class one day.1 She held a camera and tripod in her hands. I nodded, "How's it going?" "Great!" she answered, holding up her camera and smiling. "I'm putting together my online course for this summer!" My colleague planned to tape the lecture she was about to give in her face-to-face class and then load it into her online course shell for distance learners to watch. You may be wondering, why not?

In response, I offer another example. A few years ago I attended a conference focused on teaching with technology. The luncheon keynote speaker had an emergency and wasn't able to travel to attend the conference in person. So the planners solved the problem by having the speaker give his speech from his location, projecting it on a huge screen hung in the hotel ballroom. The room was filled with a few hundred educators, eight to a table. We chatted throughout lunch. Then the keynote speech began, and the audience kept right on talking! The speaker was virtually ignored. No one seemed to perceive this as rude. After all, the speaker wasn't present, so he was unaware that his presentation had become little more than a media broadcast. For many in the audience, it had been relegated to mere background noise. The planners had the right idea in not wanting to lose access to the keynote speaker, but they clearly used the wrong approach to deliver the content to their audience.

Let's face it: students frequently tune instructors out during live lectures. What's likely to happen when we are not physically with them? Attempting to replicate the classroom experience online is futile. According to Carol Twigg, President and CEO of the National Center for Academic Transformation, it is also one of the biggest obstacles to true redesign. She argues that by trying to make online courses "as good as" their traditional counterparts, we make them the same, and making them the same mostly produces an inferior, mismatched learning experience (Twigg, 2002). A new approach is needed for online learning.

ONLINE TEACHING AND LEARNING DIFFERS FROM THE CLASSROOM

Online teaching and learning can be equal to, better than, or worse than learning in the classroom; but it is not the same (see Bernard et al., 2004 for a meta-analysis comparing classroom and online contexts). It can be both meaningful and satisfying for learners, as well as for instructors, but we need a new framework from which to work. Without one, we tend to work from what we know. We deliver buckets of content using digitized versions of our classroom lectures and have students post to "discussion boards" in an attempt to simulate discussion. Like the conference planners, it's the right idea to want to share information with learners and encourage peer interaction, but it's the wrong approach if your aim is to create a meaningful learning experience with long-lasting effects. Rather than trying to replicate the classroom online, we want to design an experience that engages students in learning, fosters their interest and curiosity, and ultimately facilitates the deep-level learning that comes to them by way of technology. These are the outcomes that have been promised in the literature, observed by instructors, and experienced by some students, but unfortunately not all (Hill, Wiley, Nelson, & Han, 2004; Keeton, 2004).2 To create a unique course experience, we need a framework that will move us beyond the boxes that past experiences and university learning management systems create.

The aim of this course redesign process is to help you think more reflectively about the design and delivery of a learning experience. The goal is to do more than merely convert courses for web delivery: it is to redesign them, and to craft the kind of experiences we want for our students and for ourselves as instructors. This volume can help.

At professional conferences and workshops on redesigning courses, colleagues have shared some of their reasons for resisting the move to online teaching and learning. Some instructors indicated they lack experience as online learners themselves; their only models have been of classroom instruction, so they simply don't know what works. Others express a strong desire to "know" their learners, which they presume cannot be accomplished through online instruction. There are also those who voice trepidation over the technology learning curve. They see it as too steep for themselves, but flat for their technology-native learners, leaving them feeling outpaced.

It is true that past models of instruction do not translate directly to online environments. It's also true that our students are different. There is a growing body of research suggesting that contemporary students' minds are different due to their use of technology from a young age (Prensky, 2001). In his book The Shallows: What the Internet is Doing to Our Brains, Nicholas Carr builds upon arguments first offered by Marshall McLuhan in Understanding Media. Carr (2011) argues that media alter not only what information we receive but also how we process that information (see McLuhan, 1964 for the original). Carr quotes the American scholar Walter J. Ong on the transformative power of technology: "Technologies are not mere exterior aids but also interior transformations of consciousness, and never more than when they affect the word" (p. 51). Technology is not neutral; it alters the way we structure our work, and it will be deterministic (dogmatic, even dictatorial) if we don't apply it reflectively.

Learners also bring expectations and past experiences with them to the online environment. The mental models students have may or may not be conducive to successful online learning. Although debates about whether learners are actually different today, and if so why, fall outside the scope of this chapter, they strongly suggest that a different approach to teaching and learning is possible and is needed. Whatever approach we take should consider our content, our learners, and our own instructional style. Because technology can change radically in a relatively short span of time, this volume focuses on a set of stable considerations for redesigning courses that are steeped in pedagogy and instructional design. Decisions related to technology are not considered until after other key choices are made. This approach keeps technology in its place. To do otherwise is the equivalent of performing repairs on your car based solely on the tools you have in your toolbox. You might get lucky, but a better approach is to determine what repair is needed, and then go off in search of the tools with which to perform it.

DIME MODEL FOR COURSE REDESIGN

DIME is an acronym for the course redesign process. This process centers on four major sets of considerations for redesigning courses for online teaching and learning. Each letter represents a different set of considerations, and the sets are ordered in stages. Each stage is presented as if independent of the next, but there is overlap; decision-making should be viewed as an iterative process. Choices made at one stage in the redesign process will inform decisions made in later stages, and decisions made in later stages may reveal information indicating the need to revisit previous decisions. We briefly explain the model here and then explore each set of considerations in a separate chapter in this volume. Fig. 1.1 depicts the model.

Fig. 1.1. Four-Phase Course Redesign Model: DIME.

Overview of Considerations in the Model

DeSIGN
The letter D stands for DeSIGN in the DIME model. This is the first of four sets of considerations for course redesign. It begins with traditional instructional design practices such as determining learning objectives and selecting instructional methods. In Chapter 3, we examine the steps in the instructional design process, focusing on writing strong objectives that will serve as the foundation of our redesigned course. In identifying instructional methods, we structure the decision-making process around four essential activities that instructional methods should facilitate in an online course: the sharing of information, the demonstration of skills, the ways learners will practice those skills, and the means to ensure learning has happened. These decisions will eventually inform our technology choices, but we won't be ready to consider those choices until we determine the roles to be played by the instructor and peer learners in helping students achieve the learning objectives. These roles are at the center of the second set of considerations: interaction.

Interaction
The letter I stands for interaction. This critical set of considerations focuses on learner-centered interactions, which ultimately determine the look and feel of online courses, influencing the way learners experience them. In Chapter 4, we consider three types of learner-centered interactions first proposed by Michael Moore (1989). Moore suggested that learners interact with course content, with the instructor, and with their peers. The role each of these types of interaction plays will vary depending upon the course. Which type of interaction is privileged in a given course will depend upon your content, your learners, and your instructional style. The three types of interactions are expanded upon and organized into more specific categories for further consideration. We explore how learners interact with content through the course structure and layout, and how they interact with peers, who may be cast in the role of community members primarily providing support, or as information providers and/or collaborators. Instructor interactions serve to set expectations for learners and to facilitate learner interactions with content and peers. Interaction decisions strongly influence the character of the course experience. They are informed by decisions made previously, during the design phase, and they will play a significant role in the determination of media needs in the phase that follows.

Media
The letter M stands for media, through which course content and interaction flow. The DIME model uses the term media to refer to technology considerations because it captures the fundamental purpose of technology in online courses: to support or facilitate learner interaction with content, the instructor, and peers. In Chapter 5, we lay out a process by which media can be considered in relation to instructional and interactional goals and needs. We propose 10 media-enabled course activities to guide the selection process. These activities are organized by instructional method, type of interaction, and required media characteristics. This approach brings specific features of media into focus, narrowing the scope of our decisions.

Evaluation
The letter E stands for evaluation. Evaluation is the process by which we determine whether learning occurred. In Chapter 6, we examine opportunities and challenges in the assessment of online student learning, the evaluation of instructor effectiveness, and the assessment of course quality, and we provide strategies for exploiting the advantages and mitigating the difficulties. We explore formative as well as summative methods of evaluation. Formative methods help in establishing how things should go at the start of the course; they can also assist in determining how well things are going while the course is still in session and there's time to make changes. Summative methods help in figuring out how things went. Evaluation is a critical step that should be multifaceted and support continuous improvement efforts.

The four phases of the DIME model provide a framework for making decisions related to course design. The model is both practical and scholarly in its approach. Although the suggestion that learning objectives should be set before choosing course technology is not new, the framework the model provides for the decision-making process is unique. This volume proposes a path that clears away the confusion and provides a new approach to redesign for the online environment.

Contextual Relative Approach to Redesign

The four sets of considerations for course redesign are informed by a contextual relative approach. Contextual relativism is a term first coined by William Perry, who used it in 1970 to describe the intellectual development of undergraduate students. Essentially, the contextual relative approach involves attending to the setting or context within which development occurs (Perry, 1999). It assumes that answers to questions are relative to the situation or context. In course redesign, the decisions we make at each phase are situational. Choices should be made based upon their fit with your content, your learners, and your instructional style. There is no "one best way" to redesign an online course. Given a contextual relative approach, there are many valid ways to teach and learn online. The key is to achieve fit. Develop a vision for your course by examining each set of considerations in light of your content and learners. Then use technology to bring your vision to life. It may be easier than you think. It doesn't take advanced technology skills; it takes attention to the considerations, some time to reflect, and a willingness to experiment. These are skills you've built in the classroom, and they do translate to course redesign!

NEED FOR A SYSTEMATIC APPROACH TO REDESIGN

In order to bring about the needed changes to the teaching and learning process, we need a systematic approach to redesign that is driven by pedagogy rather than technology. Naidu (2003) argued that it should no longer be necessary to reiterate that the media are just "the vehicles of the educational transaction." To make the most of what technologies can do for you, Naidu suggests, focus on the pedagogy first and then look for the tool. The trouble is that many of us fail to think reflectively when it comes to technology. Walk through the halls of any university in the United States and you'll see rooms darkened, screens aglow, and instructors standing at the front of the room talking. Rather than using chalkboards, instructors now use PowerPoint; this seemingly small shift from notes on slate to notes on screen has changed the dynamics of classroom learning, but not necessarily for the better.

Before technology, lecturing was a teacher-centric activity. The professor shared information, watched learners' reactions, and then added clarifying examples by making notes on a chalkboard. Today, with the near-universal assimilation of PowerPoint into college classrooms, we have become slide-centric. Consider the scene: learners enter the classroom and settle into their seats; the instructor dims the lights. All eyes are immediately drawn to the contrasting element in the room, the lit screen displaying the first slide. Learners' attention goes to the slides rather than the instructor, who is in shadow. The professor becomes the aide to the slides, rather than the other way around. We haven't consciously changed the way we teach, but things have changed nonetheless. We've trained our learners that anything important will be on the slides. Some learners no longer feel a need to listen. Their expectation is that information will be distilled into bullet points and displayed on the screen for easy note-taking and memorization. Information going from screen to hand, with little if any pass through the brain, leads to surface-level learning; learners work for grades instead of knowledge and know-how. They forget the content soon after the exam is over.

We need a systematic approach to course redesign because we get stuck doing what we've become accustomed to. For instance, we can make the technology tools do what we want, but many of us don't; some of us don't even know we can. How many PowerPoint users default to the templates? You know the ones, the templates that set all of the information out in bullet points. This may not be the best way to enhance the information being shared verbally with learners, and it does little to stimulate learner thinking, something needed for learning to happen. The templates are there for convenience; we don't have to use them, but we do, usually without considering other approaches. Using templates tends to be the default approach, imposing a linear structure on information, whether appropriate or not.

The changes needed are not about more or different technology; they are about reasserting the importance and impact of pedagogy and subordinating the tools to a support role. To do this, we need a process. The DIME model steers you through the course redesign process, using research to guide thinking and practical strategies to guide choices.

INSTRUCTOR AS PRINCIPAL PLAYER IN COURSE REDESIGN

There is much chatter in the news, both mainstream and in the blogosphere, about the role of online courses and technology in educational reform.3 The excitement over MOOCs (Massive Open Online Courses), and how they might help manage educational costs and increase access, seems to be eliciting a bifurcated response from education stakeholders. Are MOOCs a panacea or a threat?4 Whatever your view, MOOCs are changing the conversations taking place within higher education. But course redesign isn't about creating or not creating MOOCs. You could create one using the DIME model; the model makes no assumptions about higher or lower enrollments being superior. MOOCs could increase efficiencies in some courses, much as large lecture courses do in face-to-face contexts. But, just as on campus, there are courses and learners that will require a higher-touch approach, making large enrollments impractical and/or inappropriate. The contextual relative approach of the DIME model leaves room for considering redesign elements in light of particular situations.

What really happens when courses move online? Those who are fearful argue the experience becomes depersonalized; some even proclaim it spells the end of the professoriate as we know it (see Kompf, 2001 for example). Those who are enthusiastic argue the opposite. They say online learning can be more readily adapted to student performance, personalizing the experience (see Langmead, 2013 for example). Neither set of arguments seems to get to the heart of what it is for the instructor to teach online. For the most part, courses are not "no-man" systems within which learners go it alone, without an instructor facilitating activities. The role of the instructor shifts from lecturer to facilitator and designer, but the instructor remains a principal player in education. Online learners need the instructor to guide them, interpret the materials, support their process, and communicate that they care. In a survey by Kim and Bonk (2006) of 562 college instructors and administrators, all members of two associations for online education (MERLOT and WCET), 66% reported that the most needed skills for instructors in the future will be moderating and facilitating online courses and developing or planning for high-quality online courses. Course content is important, but learning requires interaction, at least in most situations. Arguably, that's why MOOCs have a 90% attrition rate (Kolowich, 2013; Parr, 2013). MOOCs can work for students who are highly interested in the content, but the extremely high attrition rates indicate the average learner needs more than content. Students need the instructor, in the course with them, to help them connect the dots and put meaning to the learning activities.

ACCEPTING THE COURSE REDESIGN CHALLENGE

It helps if you're coming to the course redesign process out of a sense of excitement. As you create your course, you may find you are able to do things online that you were not able to accomplish in the face-to-face classroom. In one example from my own experience, I found learners much more willing to accept, and act authentically in, roles as business consultants when enrolled in online courses than in traditional classroom environments. In live classes, learners couldn't get past the physical context, which cast them as students sitting at desks in a classroom. The online environment could be shaped in a way that made learners feel they were in an office. They felt like professionals, and they behaved that way, readily engaging with information and using it to make solid recommendations to simulated clients.

Even if you're not excited about the prospects of course redesign, this volume may be of help. It is grounded in research, both primary and secondary, and rooted in experience. It provides a framework for decision-making that is step-by-step in approach, so it is easy to implement. To fully employ the redesign process, we will need one more tool: a lens through which to view the process holistically. In the next chapter, we explore the power of metaphor in helping us break away from default patterns of thinking, inspiring new approaches to teaching and learning. Metaphor is a powerful change agent when applied to course redesign. Through it, we crystallize our ideas about our content, our learners, and our instructional style, and how we see the three fitting together. Metaphor helps by providing the lens through which to view the four sets of considerations the DIME model proposes.

CONCLUSION

Teaching online is about change. It is not just a change in course context; it is more than shifting courses from the classroom to online. It is about creating a completely different experience, one that requires redesigning what you're doing presently; that's the exciting part. As you read this volume, you're encouraged to apply the four sets of considerations to the redesign of your own course. Do this and you just may find yourself getting excited by all the possibilities. There is no single redesign path; the DIME model provides a roadmap by which you can arrive at the destination you set for your content, your learners, and yourself. Turn the page and together we'll begin a journey.

NOTES

1. Examples are real, but locations and participants' names are withheld to protect privacy.
2. Visit the No Significant Difference phenomenon website to search and review empirical studies comparing online and classroom learning, from the 1940s going forward, representing a wide range of views on the subject. Collectively, they seem to indicate the quality of the experience is not about the delivery system, per se, but how course elements fit the content, learner, and instructor. Available at http://www.nosignificantdifference.org
3. The International Journal of Educational Reform explores the issues from a research perspective. Available at https://rowman.com/page/IJER
4. See The MOOC Moment, a compilation of essays published by Inside Higher Education, for opinions on this question. Available at http://www.insidehighered.com/quicktakes/2013/05/09/mooc-moment-new-compilation-articles-available

REFERENCES

Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., Wallet, P. A., Fiset, M., & Huang, B. (2004). How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379-439.
Carr, N. (2011). The shallows. New York, NY: W. W. Norton & Company.
Hill, J. R., Wiley, D., Nelson, L. M., & Han, S. (2004). Exploring research on internet-based learning: From infrastructure to interactions. In D. H. Jonassen (Ed.), Handbook of research for educational communications and technology (2nd ed., pp. 433-460). Mahwah, NJ: Lawrence Erlbaum.
Keeton, M. T. (2004). Best online instructional practices: Report of phase I of an ongoing study. Journal of Asynchronous Learning Networks, 8(2), 75-100.
Kim, K. J., & Bonk, C. (2006). The future trends of online teaching and learning in higher education: The survey says…. Educause Quarterly, 4, 22-30.
Kolowich, S. (2013, April 8). Coursera takes a nuanced view of MOOC dropout rates. Chronicle of Higher Education. Retrieved from http://chronicle.com/blogs/wiredcampus/coursera-takes-a-nuanced-view-of-mooc-dropout-rates/43341. Accessed on June 20, 2013.
Kompf, M. (2001). ICT could be death knell of professoriate as we know it. Canada's Voice for Academics, 48(7). Retrieved from http://www.cautbulletin.ca/en_article.asp?articleid=1766
Langmead, S. (2013, March 29). Adaptive learning helps personalize instruction for students. eCampus News. Retrieved from http://www.ecampusnews.com/technologies/adaptive-learning-helps-personalize-instruction-for-students/. Accessed on July 3, 2013.
McLuhan, M. (1964). Understanding media: The extensions of man. New York, NY: McGraw-Hill.
Moore, M. G. (1989). Three types of interaction. In M. G. Moore & G. C. Clark (Eds.), Readings in principles of distance education (pp. 100-105). University Park, PA: American Center for the Study of Distance Education.
Naidu, S. (2003). Designing instruction for e-learning environments. In M. G. Moore & W. G. Anderson (Eds.), Handbook of distance education (pp. 349-366). Mahwah, NJ: Lawrence Erlbaum.
Parr, C. (2013, May 10). Not staying the course. Times Higher Education. Retrieved from http://www.insidehighered.com/news/2013/05/10/new-study-low-mooc-completion-rates. Accessed on July 24, 2013.
Perry, W. G. (1999). Forms of ethical and intellectual development in the college years: A scheme. San Francisco, CA: Wiley.
Prensky, M. (2001). Digital natives, digital immigrants, Part II: Do they really think differently? On the Horizon, 9(6), 1-6.
Twigg, C. A. (2002). Improving learning and reducing costs: New models for online learning. Educause Review, 39(5), 28-38.

CHAPTER 2

METAPHOR AS A FRAME FOR COURSE REDESIGN

ABSTRACT

Metaphor is a powerful change agent when applied to course redesign. In this chapter, we examine the influence mental models have on our thinking and the potential consequences they have for our learners. By choosing a metaphor to frame our redesign process, we reveal our ideas about our content, our learners, and our instructional style, and how they fit together. This all-important first step in the redesign process can be a game changer, leading us to create the kind of learning experience we seek for our students and for ourselves. Metaphor provides a means to break away from default patterns of thinking, inspiring us to play and develop new approaches to teaching and learning, facilitating the redesign necessary to bring about learning in an online context. We examine real examples of courses redesigned using metaphor, and then we embark on an exploration of other metaphors and their likely influence on decisions related to course redesign. In the end, we revise the course redesign model to include metaphor.

Keywords: Course redesign; online teaching and learning; metaphor; mental models

INTRODUCTION

This chapter is about breaking out of familiar patterns of thinking so that we can redesign our courses for online delivery. Redesign moves beyond converting course materials to digital files. It requires a new approach, like the one introduced in Chapter 1. The DIME model of course redesign features four sets of considerations related to teaching and learning online: design, interaction, media, and evaluation. The act of considering is not enough by itself to bring about redesign; we need to consider things from a new perspective. By looking at things through a new mental model, we see and respond to things differently. We will use metaphor to facilitate a new approach to thinking about, and planning, our online course.

Metaphor is "a figure of speech in which a word or phrase literally denoting one kind of object or idea is used in place of another to suggest a likeness or analogy between them" (Merriam-Webster, 2013). In applying metaphor to course redesign, we force ourselves to look at things in a new way, bringing about a fresh approach. Metaphor can be a powerful change agent. According to Jensen, in his article titled "Metaphors as a Bridge to Understanding in Educational and Social Contexts," the theory of abduction, or "reasoning from," encourages the application of different lenses to see things from differing perspectives. By so doing, one arrives at an approach or method that allows for "a whole new level of possible understanding" (Jensen, 2006). Let's look at an example.

AUTHENTIC REDESIGN EXAMPLES

The Little Inn at the Crossroads

Regina Bento, Professor of Management at the Merrick School of Business, University of Baltimore, describes the power of metaphor in developing a course on leadership and spirituality, which she taught to engineers enrolled in an MBA program at MIT. Professor Bento indicated she was uncomfortable taking an overly directive approach in a course she saw as more about personal discovery, "learning to lead with soul," than learning leadership theory (Bento, 2000). In her article titled "The Little Inn at the Crossroads: A Spiritual Approach to the Design of a Leadership Course," Bento (2000) describes how a vision of a little inn at the crossroads popped into her mind and set in motion a series of ideas that reframed her role and organized the course around the principles of reflection and community. She used a metaphor of her course as a small inn, which she envisioned as a working retreat where class members would come together for a short time on their own personal (learning) journeys.

With the inn as her mental model, her course content took shape. She created a website that became the metaphorical inn. The metaphor provided a conceptual framework that drove her course design. She saw her role as innkeeper more than professor, in the sense that she wanted to provide the materials to support her "guests," but didn't want to dictate how they had to be used. Bento said "the little inn was predicated on the assumption that the adventure of learning to lead from within (with soul), cannot be imposed or described, but it can be facilitated" (2000, p. 653). To facilitate the individualized leadership journey, the inn offered a library stocked with influential works, a studio for exploration and self-expression through work products, and workshops to bring everyone together to share, build community, and facilitate needed reflection. The metaphor was powerful in giving shape to a course that provided learners with a meaningful experience while working to meet challenging learning objectives (Bento, 2000).

Metaphor can help us create new mental models by helping us think about our courses differently. It provides a process by which we can unstick our thinking from default patterns, inspiring us to play and develop new approaches to teaching and learning, facilitating the redesign necessary to bring about learning in an online context. Metaphor helps us to look at things differently, and this can have dramatic effects when it comes to course redesign.

Medical Education as a Symphony

James Woolliscroft, Dean of the University of Michigan Medical School, and co-author Robert Phillips (2003) redesigned their approach to medical education using the metaphor of their course as a symphony. They contend that framing medicine strictly as a science obscures the fundamental reality that medicine cannot be practiced as dispassionate science: it is performed in conjunction with an audience (the patient), without whom the process is void of purpose. According to Woolliscroft and Phillips (2003), looking at medical education through the lens of the symphony framed a mental model that encouraged attention to both the technical and interpretive aspects of medicine. It also highlighted the role of the patient in its performance. They argue that a violinist not only needs technical mastery of the instrument but also the ability to interpret the notes in order to create music; the audience is needed to complete the experience. The symphony metaphor altered the default thinking that medicine is science and allowed them to take a new (and better) approach to physician education.

MENTAL MODELS ARE NOT ALWAYS CONSCIOUS

Metaphors That May Be Limiting Your Thinking

Metaphor provides a new lens through which to view our content, our learners, and our role as instructors. We may not realize it, but there may already be metaphors in play that are unconsciously directing our thinking about teaching and learning. We see this in the metaphor of the professor as "sage on the stage" (Naidu, 2003). If this is the lens through which the instructor's role is viewed, then the assumption would be that the majority of the topic knowledge is possessed by the professor, whose job is to give it to students through the presentation of facts. The operating ontology would be that education is mostly about providing information. At the center of that process is the professor. Applied to course redesign, the metaphor would likely yield a class consisting of video lectures and text files organized in buckets, one per topic. If the professor knows his or her stuff, and the students take it in and study it, it is expected that learning will occur.

Change the metaphor applied to the professor and the design changes. Watch what happens if we view the professor's role as "guide on the side" (Naidu, 2003). The student is now at the center of teaching and learning, not the professor. The professor coaches the student in developing skills and knowledge to reach course goals. The role of the professor shifts from providing information to guiding students through activities designed to uncover information and practice skills. The ontology here is that education is about designing opportunities to learn rather than providing information. Either of these models can work given apt content, learners, and instructor. It's a matter of finding the lens that best fits your course and using it to consciously drive your redesign decisions.

Past Experience Can Create Unhelpful Models

Problems arise with mental models when they aren't conscious. For instance, you may have taught a course for 10 years before you decide to bring it online. Whatever your approach has been will likely be your default. But our experiences can imprison us in ways of thinking that don't fit new realities. So it is with online teaching and learning. It's not about digitizing what you do now. We need to break with the old models and determine a new one that works online.

Julie Dirksen (2012), in her book Design for How People Learn, argues that as we gain proficiency, we develop mental models that allow us to become more and more efficient at the task. These models are helpful, so long as the task stays essentially the same. But when the task changes, the old mental models can get in the way. We default to them instead of looking at things anew. So if developing courses is like riding a bike for you, you are apt to apply old ways of teaching to a context that requires new ways. Dirksen (2012) uses the example of the American golfer Tiger Woods attempting to change his golf swing. His game takes a hit at first as he unlearns the old way and becomes proficient at the new way. But if the new way is what's needed, he'll be ahead in the end. So it may be for you and your students. We want to guard against habitual rather than reflective practice when developing learning experiences, especially in online contexts.

We also need to consciously explore the mental models we choose for our redesign to ensure they shape our course in ways that fit the content, the learners, and the instructor. Deborah Appleman (2010), Professor of Educational Studies at Carleton College, employed the scaffolding metaphor in her book on adolescent literacy and reading for literature teachers. This is a popular metaphor that has been used as a conceptual framework by educators for nearly 30 years (Bruner, 1975; Palincsar, 1986; Palincsar & Brown, 1984). Scaffolding equates to structured support provided by the instructor through interaction, which guides the thinking and actions of the learner so they can reach proficiency (Dyson, 1990). Appleman (2010) admits that scaffolding makes certain assumptions about her students and the process of learning to read that might not fit for primary grade instructors:

…I'd never really thought too much about the implications of the metaphor behind that word scaffolding. I hadn't considered the limits (the rigidity, uniformity, and linearity) such a "building" metaphor might imply, leading to one-size-fits-all, teacher-directed support in the classroom. (p. 56)

Appleman (2010) speaks to the power of mental models and how their influence on our thinking is often unconscious. By deliberately choosing a metaphor through which to view our course redesign, we move out of default thought patterns and become more aware of our choices. Such was the case for Anne Haas Dyson in applying a different metaphor for teaching literacy skills: she used the metaphor of weaving. Dyson (1990) explained her choice this way:

Scaffolding is a vertical metaphor, one that represents how more skillful others support children's progress within one activity; weaving has a more horizontal dimension. It suggests how [students'] progress in any one activity is supported by their experiences in varied activities. (p. 204)

In other words, the path and pattern to proficiency will be unique to each learner based upon how the individual integrates information and practice. The change in metaphor from scaffolding to weaving alters the approach to the teaching and learning of literacy skills. Instruction becomes less prescriptive and more focused on supporting student exploration and application to facilitate integration of knowledge and skill.

METAPHOR IS A FRAME AND AN IDEA GENERATOR

Again, either metaphor can work as a redesign framework, but each reflects a unique view of the learner and instructor roles. These distinctions drive the redesign process differently. Metaphors are not one-size-fits-all; the choice of which mental model to use isn't neutral. The trick to finding the right lens is to try out a variety of metaphors to see which best captures the essence of the experience you want for your learners and yourself. Each metaphor highlights particular aspects of teaching and learning, so choose the one that best encapsulates the needs of the students and inspires you to think about teaching differently.

Another Authentic Example

We use metaphor as part of the redesign process, but it need not be shared with the learners to be effective. In redesigning a course in business communication, I began thinking of my course as a baseball game. For those not familiar with the game, baseball is played with two teams that take turns "at bat" in an attempt to "score." To score, batters must get hits and advance around four bases to earn a "run." Bases are evenly spaced around the infield in a diamond shape. The better the hit, the farther around the bases a batter will go. If he gets to first base on his own hit, it's a "single." If he gets to second base, it's a "double." Third base is a "triple," and all the way around the bases to home base is a "home run." A batter is "out" if he fails to hit the ball after three tries, or if an opposing player catches the ball in the air before it touches the ground. A player is also out if an opposing player possessing the ball gets to the base before the batter does. A team gets three outs before having to take the field. The winning team is the one with the most runs at the end. There are more details, but those are the essentials needed to understand the metaphor.1

In thinking about my course as a baseball game, I saw learners as batters and the products they would produce as hits. Students would learn a given theory and then apply it to explain a situation and develop a response using particular writing conventions. It was assumed that at the beginning, learners would be unlikely to achieve mastery. I saw the initial submission of an assignment as advancing them to first base; before they could advance another base, they would need to improve their skills. This set up the idea that for each lesson there would be structured opportunities for revision and feedback. By the end of the course, better students would be hitting home runs on their first tries, no revision needed. So each lesson built on the previous one, and students strove to go from good to better and from better to best.

In developing this course, I had the good fortune to be working with a graduate assistant. She was watching my process and asked me whether I thought the baseball metaphor would appeal to all students. I hadn't thought about what the model I chose might do to my learners' thinking. Sometimes it will make sense to share your model, as Professor Bento (2000) did with the Little Inn at the Crossroads. In my case, the model would add little meaning for my students, and it might even get in the way. So I chose another model for my learners. I outwardly set the course in the context of an office and created workspaces using visual cues built in Flash. These workspaces allowed students to sit in virtual cubicles, open files, and advise clients in responding to customers and management through a variety of document styles (letters, memos, fact sheets, and newsletters). This inspired me to cast students in the role of communication consultants. Like batters who must respond to varying types of pitches, learners as consultants would need to respond to varying situations. I maintained the good, better, best approach to the assignments developed through the baseball metaphor and took on the role of their manager, advising them and eventually evaluating their performance.

This worked better than expected. By setting the course in an office and giving learners a professional title, they took ownership of their work in ways I hadn't predicted. They weren't just playing the role of consultant; they took ownership of the role. They produced professional-quality documents that demonstrated deep understanding of the course material and the task they were undertaking. With some encouragement, they regularly sent their work out to their peers for review, getting it to better and best before it reached my desk.

Metaphor is an Essential Redesign Tool

Metaphor functions as both framework and idea generator. Later, as you undertake the redesign process, your metaphor will be running in the background, inspiring you to think differently and influencing the choices that you make. I chose the lens of a baseball game in redesigning my course; had I chosen a different metaphor, my course would have ended up different, not necessarily better or worse, but different, and not as good a fit for my content, my students, and my instructional style. Tables 2.1-2.4 depict four other metaphors and suggest how each might alter the design of a course in business communication.

Exploring How Different Metaphors Uniquely Influence Redesign

For each metaphor, "general view" reflects the instructor's perspective about the learning experience. "Application" and "content fit" reflect the approach to presenting and interacting with content. "Student fit" reflects the instructor's view of the learners.

Table 2.1. Playground Metaphor.

General view: Experiment with content. Play with concepts/skills.
Application: "Play stations" provide opportunities to try skills; the aim is to figure things out, so many tries and immediate feedback would be essential. Slide: activities that are fairly straightforward. Seesaw: activities that require a balanced approach. Swings: activities that require some risk taking, but are fun. Monkey bars: activities that require effort and drive.
Content fit: Activities are smaller in scope and focus on discrete skills/concepts. For instance: write an effective email subject line, introduce yourself to someone at a networking dinner, negotiate with your boss for a day off.
Student fit: Likely to fit with student experience. Better suited to survey-style courses where exposure to concepts/skills is the goal, rather than mastery.

Table 2.2. Marketplace Metaphor.

General view: Choose relevant content. Combine concepts/skills.
Application: Storefronts each represent a skill or concept to master: Listening store, Speaking store, Writing store, Graphics store, Persuasion store, Audience store.
Content fit: Projects involve a variety of skills that come from different sources. Preparing a sales letter combines skills from the audience, persuasion, and writing stores.
Student fit: Likely to appeal visually and support the idea of interconnectedness of the skills. Best if the focus will be on communication channel rather than other elements such as relational context.

Table 2.3. Safari Metaphor.

General view: Seek contextualized content. Apply the concepts/skills.

Application: A focused journey with a specific goal in mind (i.e., become a better communicator). Different animals represent different interaction partners. Lion: a dominating boss. Zebra: competitive yet cooperative peers. Giraffe: the hard-to-reach customer. Elephant: stubborn or difficult people.

Content fit: Projects will focus on context over content. Problem-based scenarios with students planning and executing various interaction strategies to reach desired outcomes.

Student fit: Won’t be easily understood as a forward-facing metaphor, but could be used as a framework. Works well if the objectives focus on communication context rather than communication channel.

Table 2.4. Archeological Dig Metaphor.

General view: Discover the content. Interpret with concepts/skills.

Application: The “dig site” has various concepts/skills to be excavated, identified, and assessed. Artifacts might represent common processes constituted through social interaction: organizational culture, organizational structure, organizational identity or brand, and decision making.

Content fit: Projects will focus on describing and assessing abstract, social processes that influence organizational life. For instance, examining the influence of new communication technologies on the way information moves through an organization.

Student fit: Advanced students may find this metaphor rich and inspiring. Would work better for a course in organizational behavior than business communication.

Looking across the four metaphors, we see differences related to content, to learners and how they will interact with the content, and to the role the instructor will likely play. Content varies in terms of how interdependent the various elements appear to be. The metaphors run along a continuum from discrete elements (in the playground metaphor) to integrated ones (in the archeological dig metaphor), with strong clues about how they will be united in the general view (combined, applied, interpreted). The interdependence of the elements has implications for learners’ prerequisite skills and experience. The playground metaphor works well for learners new to the content or those lacking prerequisite skills. The marketplace metaphor assumes more skill, but not as much as would be needed for students in courses designed using the safari or dig metaphors. Finally, the role of the instructor appears to be directive for the playground metaphor, which features activities that will require feedback, and facilitative for the safari and dig metaphors. More complex projects will benefit from guidance, but will not conform well to everyone moving rigidly through the elements. Learners will need to take the initiative, and the instructor will need to provide support and reinforcement.

Metaphor Reflects Your Views on Content, Learners, and Instructor

Now you try it. Apply each of the metaphors in Tables 2.1–2.4 to begin the redesign of your course. Choose the one that best fits your learners, your content, and your style. If none of them seems to fit, try playing with others: the ocean, leaves on a tree, a rock-climbing wall, a Shakespearean drama, the trading floor of the stock exchange. Use your imagination; you may be surprised at the perspectives you uncover. Think of the metaphor you choose as a mirror, reflecting back to you your conscious, and not so conscious, views about how you, your learners, and your content fit together. These views are what drive the redesign process and influence the choices you make. Fig. 2.1 illustrates the point that your choice of metaphor is input to the process, but it is also a reflection of your views, whether conscious or unconscious. Raising awareness of the power of our mental models helps ensure we’re satisfied with our choices.

Fig. 2.1. Metaphor Reflects Your Views about the Content, Learners, and Instructor and How They Fit Together.

REVISITING DIME AS METAPHOR

The course redesign process is about creating a new mental model for your course rather than having it determined for you by the tools you use or the way you’ve done things in the past. The approach we choose should be determined independently, reflectively, and with a particular end in mind (Parker & Ingram, 2011). This volume offers a process by which you can develop your approach using the course redesign model introduced in Chapter 1. The DIME model is more than an acronym representing the phases of redesign: Design, Interaction, Media, and Evaluation. It’s also a metaphor that captures the changing nature of teaching and learning today, and it reflects my views about this process and how I should present it to others.

The metaphor features the American 10-cent piece, the dime, which represents 1/10th of a US dollar. In the United States, monetary units worth less than one dollar are coins, and we refer to coins collectively as “change.” The play on words was appealing, but the metaphor does more than just point to the need for a change in approach to course design. The dime is no ordinary piece of change. Americans have many sayings revolving around this, our smallest, but not our least valuable, coin. (Our least valuable would be the penny, worth 1/10th of a dime despite its larger circumference.) This littlest coin has inspired idioms such as “turn on a dime,” which refers to a radical change in direction in a small space (Lighter, 1997). Arguably, our approach to course redesign is a radical departure from the all too common practice of replicating the face-to-face course design in digital format. We may also need to make our choices quickly, before they are dictated by a technology purchase or some other impediment. Another idiom featuring the dime is “get off the dime,” which means to get moving or get started, although there is a lack of consensus about its original meaning (Safire, 2002). Then there is a pair of idioms that seem to reflect the concern that without a new approach to course redesign, you’re likely to be disappointed in the experience: “not worth a dime” and “dime a dozen.” The former means poor quality; the latter implies something is so common that it is of very little value (Ammer, 1997; Dime a dozen, n.d.). The metaphorical and idiomatic phrases tied to this small coin highlight the changing nature of pedagogy in the age of technology and our need to keep up!

THE REVISED COURSE REDESIGN MODEL

As with the course redesigned through the baseball metaphor, it is not critical that you, the reader, buy into the DIME metaphor. Rather, it is an organizing scheme. The lens is for the creator rather than the consumer; it reflects my views about the fit of my content, my readers, and me. Fig. 2.2 adds metaphor to the course redesign process. The arrows between each of the elements indicate that each phase provides content for, and influences the choices made in, the next phase. Metaphor is at the top, as it represents the overall lens through which you view the entire redesign process.

Fig. 2.2. Course Redesign Process, Featuring Metaphor as the Lens through Which the Remaining Elements Are Viewed.

CONCLUSION

Start the redesign process by finding your metaphor. If you undertake this process in earnest, you will redesign a course that becomes the kind of experience you want for your learners and for yourself. In the next chapter, we begin considering elements related to basic course design. These consist of determining your course objectives and identifying the instructional methods. We will follow a detailed process to create a blueprint for your course. Metaphor will play a role in these critical decisions.

NOTE

1. If you’re interested in learning more about American baseball, visit Major League Baseball’s website, http://mlb.mlb.com, or follow this link to the official rules: http://mlb.mlb.com/mlb/downloads/y2012/Official_Baseball_Rules.pdf


REFERENCES

Ammer, C. (1997). Dime a dozen. In The American Heritage Dictionary of Idioms. New York, NY: Houghton Mifflin Company. Retrieved from http://dictionary.reference.com/browse/dime+a+dozen. Accessed on July 24, 2013.

Appleman, D. (2010). Adolescent literacy and the teaching of reading: Lessons for teachers of literature. Urbana, IL: National Council of Teachers of English.

Bento, R. (2000). The little inn at the crossroads: A spiritual approach to the design of a leadership course. The Journal of Management Education, 24(5), 650–661.

Bruner, J. (1975). The ontogenesis of speech acts. Journal of Child Language, 2, 1–40.

Dime a dozen [Def. 1]. (n.d.). In Thesaurus.com. Retrieved from http://thesaurus.com/browse/dime+a+dozen. Accessed on June 28, 2013.

Dirksen, J. (2012). Design for how people learn. Berkeley, CA: New Riders.

Dyson, A. H. (1990). Weaving possibilities: Rethinking metaphors for early literacy development. The Reading Teacher, 44(3), 202–213.

Jensen, D. (2006). Metaphors as a bridge to understanding in educational and social contexts. International Journal of Qualitative Methods, 5(1), 1–16.

Lighter, J. E. (1997). Historical dictionary of American slang (Vol. 2). New York, NY: Random House.

Metaphor. (2013). In Merriam-Webster.com. Retrieved from http://www.merriam-webster.com/dictionary/metaphor. Accessed on July 24, 2013.

Naidu, S. (2003). Designing instruction for e-learning environments. In M. G. Moore & W. G. Anderson (Eds.), Handbook of distance education (pp. 349–366). Mahwah, NJ: Lawrence Erlbaum.

Palincsar, A. S. (1986). The role of dialogue in providing scaffolded instruction. Educational Psychologist, 21(1&2), 73–98.

Palincsar, A. S., & Brown, A. L. (1984). Reciprocal teaching of comprehension-fostering and comprehension-monitoring activities. Cognition & Instruction, 1(2), 117–175.

Parker, R., & Ingram, A. (2011). Choosing online collaboration systems: Functions, uses, and effects. Journal of the Research Center of Educational Technology, 7(1), 2–15.

Safire, W. (2002, October 6). Off the dime. The New York Times Magazine. Retrieved from www.nytimes.com/2002/10/06/magazine/06ONLANGUAGE.html. Accessed on February 10, 2013.

Woolliscroft, J. O., & Phillips, R. (2003). Medicine as performing art: A worthy metaphor. Medical Education, 37, 934–939.

CHAPTER 3

DESIGN CONSIDERATIONS

ABSTRACT

Instructional design involves the identification of strong learning objectives and the selection of instructional methods to accomplish them. In this chapter, we consider how to write online course objectives that will serve as a foundation for future redesign decisions. Strong learning objectives are observable, measurable, attainable, and specific. They are focused on the needs of our target audience and should fit with our instructional philosophy as reflected by our metaphor. We explore how individual differences, such as demographics, personality, past performance, and expectations, can affect learner needs and preferences, which should inform learning objectives and instructional methods. We structure the design process around decisions related to four essential activities that instructional methods should facilitate: the sharing of information, the demonstration of skills, the ways for learners to practice skills, and the means to ensure learning has happened. We concentrate on selecting general methods of instruction, which we will later refine and adapt for online delivery. We walk through the DeSIGN process in detail, determining strong objectives and exploring how to use them in identifying instructional methods. Intersections between these decisions and future redesign considerations are also discussed.

Keywords: Course redesign; online teaching and learning; instructional design; learning objectives; instructional methods; online education.


INTRODUCTION

Course redesign continues with the consideration of elements related to basic instructional design. Throughout the redesign process, our decisions will be filtered through the lens of our chosen metaphor. The DIME model separates course redesign into four major sets of considerations, leading to decisions about (D) design, (I) interaction, (M) media, and (E) evaluation. We will consider each set in turn. It is important to note, however, that choices made in one phase of redesign will overlap with those made later. Decisions are interdependent and reciprocally influential; the phases of redesign are iterative and should be viewed as ongoing. The model provides a useful framework for structuring decision-making to ensure we attend to all of the major considerations.

There is no single best way to redesign a course. It is a contextually relative process; in other words, effective redesign depends upon the situation or context (Perry, 1999). The results will vary based upon the content, learners, and instructor. To understand your context, we begin the process by analyzing your target audience, the learners, to determine who they are, what their needs are, and what expectations they may bring to the learning situation. We will use this information throughout the redesign process, but perhaps at no time more than during the instructional design phase. In the next section, we survey past research results related to individual learner differences that should be considered as you redesign. Afterward, we investigate the elements that comprise the design phase: determining objectives and identifying general instructional methods to be used in accomplishing them.

ANALYZING YOUR TARGET AUDIENCE

Learners are at the center of the redesign process. Who are they? What are they ready to do? What potential support for, or impediments to, learning exist? To explore these questions, we turn to the work of Eddy, Donahue, and Chaney (2001), who use a contextual relative approach in identifying a variety of factors that affect learners’ abilities to succeed in an online environment. The most relevant of these factors for course redesign are those they identify as intrapersonal in nature. Intrapersonal factors represent a number of characteristics that relate to the knowledge, attitudes, and beliefs learners bring to online learning (Eddy et al., 2001).


Students’ attitudes toward learning have been conceptualized in a variety of ways and tied to outcomes from satisfaction to learning orientation. For instance, John T. E. Richardson of The Open University in the United Kingdom conducted a series of studies (with and without co-authors) and found that intrapersonal variables such as age, gender, and prior qualifications (performance on qualifying exams) all influence student perceptions of the learning environment and their approach to learning (Richardson, 2006). Other factors previously found to influence student satisfaction and performance are personality, skill/knowledge competency, and expectations for online learning. These factors have been studied in a variety of ways, but rarely in isolation. Confounding effects plague many studies, which is why we present them as considerations rather than prescriptions for redesign.

Demographics

Studies have found that demographics such as age and ethnicity play a role in student engagement in the learning process. For instance, DiBiase and Kidwai (2010) found that older adult students were more independent in their learning, even when learning through technology. Jost, Rude-Parker, and Githens (2012) recently found that age and ethnicity were predictors of performance among students enrolled in two-year colleges in Kentucky. However, those differences disappeared as predictors when controlling for past performance; grade point average (GPA) became the only predictor of final course grade. In a study conducted at The Open University in the United Kingdom, Richardson (2012) found that ethnic minority students (identified as black, Asian, and Chinese) were less likely to successfully complete courses than their white counterparts. These findings held true regardless of age, gender, socioeconomics, or prior school performance.

There are many potential explanations for findings such as these. Previous research has suggested that older students come to the classroom with an increased sense of why they are there and how the educational experience applies to their lives (Knowles, Holton, & Swanson, 2005; Sitzmann, Kraiger, Stewart, & Wisher, 2006). Links between ethnicity and performance have been explained as resulting from differences in cultural expectations and fit (Ogbu, 2003). Gender differences have been found to strongly influence perceptions of instructional technology (Parker, Bianchi, & Cheah, 2008).


A meta-analysis conducted by Severiens and ten Dam (1994) suggested gender differences might actually be due to learning styles, which they found varied by gender. Males were more likely to prefer an abstract learning style; females were more likely to prefer a concrete learning style. These learning styles came out of the work of David Kolb (1981), who describes concrete learning styles as exhibiting a preference for practical application and abstract learning styles as preferring more abstract conceptualization. Learning style can play out in preferences for course structure and technology use. For instance, Hoskins and van Hoof (2005) found increased course Internet use among males, while Heffner and Cohen (2005) found increased course use of WebCT (a learning management system) by females. The emergent nature of the Internet would fit with the goals of someone with an abstract style, looking for several possible explanations; the more structured nature of WebCT would better enable application.

Caution should be used in considering demographic differences among learners; findings are associative rather than causal. For instance, race has not been causally linked to lower performance. Rather, commonly associated circumstances, such as socioeconomic status or urban/rural upbringing, will affect the way learners understand examples or react to course redesign.

Past Performance

Differences among learners in past performance have been found to influence learner preferences about course structure and technology use. In a study of 3,145 undergraduate students at a large Midwestern university in the United States, cumulative GPA was found to influence learner perceptions of how helpful classroom technologies such as PowerPoint and WebCT were. Actual GPA data were used in conjunction with scale measures of learners’ perceptions about the technologies. The higher the GPA, the less helpful the classroom technologies were perceived to be, when all other factors were held constant (Parker et al., 2008). GPA has also been found to be predictive of success and persistence in online courses (Levy, 2004; Wojciechowski & Palmer, 2005). As with demographics, we urge caution when considering past performance data. Lower GPA does not preclude students from persisting and succeeding in online courses; rather, a different approach may be warranted. For instance, DiBiase and Kidwai (2010) suggest instructors need to be more proactive in stimulating participation among some students and develop a means to explicitly evaluate tacit learning from online discussions to further support learning.

Personality and Learning Style

The influence of personality and learning style in online environments is an additional set of intrapersonal factors to consider. In a study examining personality and preferences for online vs. face-to-face classes, students who scored as introverts on the Myers-Briggs Type Indicator preferred online classes, while students who scored as extroverts preferred face-to-face classes (Harrington & Loffredo, 2010). Open-ended response items provided insights into why each personality type preferred its chosen teaching mode. Introverts reported liking the convenience of online classes, along with expressing enjoyment in computer technology, a desire for innovation, and time for reflection. Extroverts reported that the structure of a face-to-face class allowed them to learn by listening, which they favored. They also reported that face-to-face classes allow them to better gauge the emotional reactions of others, which was important to them.

As with demographic characteristics, the results above are not causal, and they aren’t always consistent. In a study examining Facebook behavior and personality, results indicated that offline personality is extended by online behaviors (Gosling, Augustine, Vazire, Holtzman, & Gaddis, 2011). Students scoring high on extraversion on the Big-5 personality test were found to post more frequently on Facebook, to comment more on others’ statuses and pictures, and to post more content on their own profiles. Although this study did not examine learning, the findings are similar to those of Blau and Barak (2012), who found extroverted students preferred a richer, synchronous medium for discussion and more introverted students preferred leaner media such as text chat.

Taken together, personality research suggests you should look for opportunities to better engage extroverts online, giving them more opportunity for synchronous interaction so they can learn by listening and better experience the emotional reactions of others. But it needs to be done in such a way that it maintains the characteristics valued by introverts, such as convenience and time for reflection. Considerations such as these will inform decisions about instructional methods as well as choices related to interaction and media. One final set of intrapersonal considerations we will explore relates to student motivation and expectations of learning online.


Expectations of Online Learning

Students bring experience and expectations into the online classroom. For instance, many students take online classes thinking they’ll be easier (Nash, 2005). Others believe they will take less time (Pierrakeas, Xenos, Panagiotakopoulos, & Vergidis, 2004). Misconceptions such as these are reflected in the disproportionately high rate of attrition in online classes. Recent studies report dropout rates as high as 50% for online learners (Aragon & Johnson, 2008; Nichols & Levy, 2009). In light of recent reports that the dropout rate for online classes is 15–20% higher on average than for face-to-face classes, researchers at Kennesaw State University in Georgia, United States, set out to explore why. One of their findings was that student expectations didn’t match course realities; learners thought the online class would be easier. Traditional course-retention strategies such as emails and phone calls from the instructor had no effect on retention rates, which were just 70% in the treatment group and 69% in the control group (Leeds et al., 2013).

Motivation also shapes learner expectations. A study of 103 students enrolled in a yearlong occupational training program used a pretest/posttest design to measure student expectations and motivation to learn for each course before it began; afterward, the researchers measured learner reactions and actual learning. Sitzmann, Brown, Ely, Kraiger, and Wisher (2009) found that course expectations strongly predicted motivation to learn, motivation to learn positively influenced trainee reactions to the course, and trainee reactions predicted expectations for future courses. The results indicate a need to attend to learner motivation, specific to our course content and design. The better job we do in planning our redesign, the more effectively we can set (or reset) learner expectations while encouraging motivation to learn.

Richardson suggests that a student’s learner type will make a difference in his or her expectations and reactions. Differences have been found in student perceptions of a course based upon whether students were meaning-oriented or reproduction-oriented. Meaning-oriented learners seek to fully understand the content, an approach sometimes referred to as deep-level learning. Reproduction-oriented learners are the equivalent of surface learners, seeking mainly to memorize for assessment rather than understanding (Lawless & Richardson, 2002; Richardson, 2003). Richardson (2005) explored students’ perceptions of the learning environment using the Course Experience Questionnaire. Integrating his results with data from two previous studies, Richardson (2006) concluded that students’ overall satisfaction with a course is higher when they perceive course assessment and workload to be appropriate, goals and standards to be clear, course materials to be interesting, and the skills required to enhance their overall confidence in their performance.

Together, the previous findings suggest the need to identify student expectations and reset them if necessary. To set expectations, first articulate clear learning objectives so that students can determine the fit between their learning needs and the course. Expectations will also extend to the instructional methods used to reach the objectives and (hopefully) make the course interesting and engaging. In the next section we work on the first element of DeSIGN,1 determining the course objectives.

DETERMINE OBJECTIVES

A centerpiece of good design is creating strong learning objectives that state specifically what students will be able to do as a result of taking the course. Online teaching and learning is a natural environment for learning by doing. In a learn-by-doing environment, it is important to write course objectives in terms of behaviors. In other words, objectives should be written so they describe what the learners will be able to do after completing the course of study. This is true whether your goals are declarative or procedural in nature; procedural goals are more obviously action-oriented, but declarative objectives can be written so they too are behavioral.2 Ultimately, objectives should paint a picture of how students will be changed as a result of their learning experience. Strong learning objectives set up goals for what learners will be able to do, and they also indicate how well they should be able to do it. The objectives will strongly influence all of your course redesign decisions. Setting your objectives is perhaps the most crucial decision you’ll make in the overall process.

Characteristics of Strong Learning Objectives

Strong learning objectives have four characteristics. They are: (1) observable, (2) measurable, (3) attainable, and (4) specific (Arnold & McClure, 1995; Parker, 2005). It takes clear thinking, discipline, and some practice to master this skill. We begin with an example and then some practice in recognizing the characteristics.


Unpacking the Four Characteristics Using an Example

Example: Learners will demonstrate their knowledge of eight theories of motivation by listing their elements with 80% accuracy and applying all of the elements of a single theory to devise methods to improve employee morale and productivity in a given situation.

Make Outcomes Observable

The first characteristic of strong objectives is that they are written so they result in a behavior, which is observable. Even objectives for survey courses with goals to increase declarative knowledge can be phrased in terms of behaviors. To be observable, you should be able to see your learners perform the action or see the tangible results of their having performed it. In the example, we want learners to know the elements of eight theories, but you can’t see “know.” By phrasing the objective as a behavior (listing the elements), learners’ knowledge becomes observable. Learners produce lists of elements, an action that is indicative of their increased knowledge of the theories. In the example, one of the observable results will be the list of theory elements, but other parts are observable as well. For instance, the results of learners’ application of the elements of one theory to a situation will be visible through the method they devise for improving employee morale. There is no need to know exactly how learners will accomplish this yet (i.e., via case study analysis or labeling elements in a scenario). Those decisions will be made later, when we choose our instructional methods. Making objectives observable can be challenging. It requires distilling all of the information learners should know to arrive at a limited set of concisely phrased learning objectives. The process, while initially demanding, eases future decisions. Instructional method flows naturally from behavioral objectives.

Make Outcomes Measurable

The second characteristic of strong objectives is that they are measurable. As part of objective setting, we need to indicate how much or how well learners will perform the outcome behaviors. By presetting standards, we bring into clearer view what it is we want students to be able to “do” at the end of our course. Consider your target audience and the skills they will likely bring to the learning environment; consider where they need to be at the end to meet the needs of future courses or job requirements. Set standards high enough to make the objective challenging, but not so high as to make it unachievable for students.3 Standards have strong implications for how assessment of learning will be carried out. For example, accuracy in reciting elements of a theory would easily lend itself to a standard examination or perhaps a verbal presentation. Application would lend itself well to a case study or a consultative project. How the measurement will happen will be partly a function of instructional method and partly a function of decisions made during the evaluation phase of course redesign, which we examine in Chapter 6.

Make Outcomes Attainable

The third characteristic of strong objectives is that they must be attainable. This is partly about the standards of performance by which outcomes will be measured, but it’s also about time and resources. Most learners must be able to achieve the objective given the starting state of their skills and knowledge, in consideration of the time and other resources that are available to support accomplishment of the objective. In our example, it would be unlikely that learners would have enough time to learn the procedural elements of the theories in order to apply them if the class only met for three weeks. Learners could probably list the elements of a subset of theories and perhaps recognize their application. It is important to be realistic about what can be achieved in light of the circumstances.

Make Outcomes Specific

The final characteristic of strong objectives is that they are specific. This ensures you know precisely what you want learners to be able to do at the end of the course. In the example, we added specificity by including the number of theories to be covered. This also enhances the measurability of our goal and will help in making content decisions.

Recognizing the Four Characteristics of Strong Learning Objectives

Writing strong objectives takes practice and revision. Table 3.1 depicts successive versions of the example objective. Note how more detail is added to the objective until it achieves the clarity and strength needed to shape future decisions. The presence of a characteristic is indicated with an X; if the characteristic is missing, it is marked with a dash; if it is unclear, it is marked with a question mark.


Table 3.1. Revising Learning Objectives for Strength and Clarity.

Version 1: Learners will demonstrate their knowledge of theories of motivation through listing their elements and applying them accurately to devise methods to improve employee morale and productivity.
O: X | M: - | A: ? | S: ?
Explanation: Observable, because we can observe a student listing (either verbally or in writing). Not measurable, because we need a means by which to determine how well or how much. More detail is needed to determine attainability and specificity.

Version 2: Learners will demonstrate their knowledge of theories of motivation through listing their elements with 80% accuracy and applying each of the elements of the theory to devise methods to improve employee morale and productivity.
O: X | M: X | A: ? | S: ?
Explanation: Measurable, in that we know students need to be 80% accurate in listing the elements of the theories. More specificity is still needed; this goal may be unattainable within a 15-week semester.

Version 3: Learners will demonstrate their knowledge of eight theories of motivation by listing their elements with 80% accuracy and applying all of the elements of a single theory to devise methods to improve employee morale and productivity in a given situation.
O: X | M: X | A: X | S: X
Explanation: Specific, in that the focus is on eight theories for recitation and one for application.

Key: O = Observable; M = Measurable; A = Attainable; S = Specific. X = present; - = missing; ? = unclear.
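For instructors who keep their course blueprints in software, the review in Table 3.1 can also be captured as a small data structure. The sketch below is purely illustrative and not part of the DeSIGN process itself; the class name, field names, and paraphrased example entries are my own assumptions. It requires Python 3.10 or later.

from dataclasses import dataclass

# Each flag records the reviewer's judgment about one characteristic:
# True = present (X), False = missing (-), None = unclear (?).
@dataclass
class ObjectiveReview:
    text: str
    observable: bool | None
    measurable: bool | None
    attainable: bool | None
    specific: bool | None

    def is_strong(self) -> bool:
        # An objective is strong only when all four characteristics are present.
        return all(flag is True for flag in
                   (self.observable, self.measurable, self.attainable, self.specific))

# The three versions from Table 3.1, paraphrased and encoded as reviews.
drafts = [
    ObjectiveReview("List elements; apply them accurately.", True, False, None, None),
    ObjectiveReview("List elements with 80% accuracy; apply each element.", True, True, None, None),
    ObjectiveReview("List elements of eight theories with 80% accuracy; apply one theory.", True, True, True, True),
]

for number, draft in enumerate(drafts, start=1):
    status = "strong" if draft.is_strong() else "needs revision"
    print(f"Version {number}: {status}")

Running the sketch prints “needs revision” for the first two drafts and “strong” for the third, mirroring the progression in the table.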

Target Audience Reflected in Objectives and Metaphor Choice

Objectives are clearly informed by knowledge of your target audience. For instance, if learners lack previous experience with the content, you might limit objectives to lower-order skills such as knowledge, comprehension, and application, and sequence learning activities to move students from declarative to more procedural outcomes (Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956). As learners become more advanced in their program of study, your objectives might involve only analysis, synthesis, and evaluation.

Knowledge of your target audience likely influenced your choice of metaphor, whether consciously or unconsciously. An instructor choosing a playground metaphor likely sees learners as needing to play with ideas and the role of the professor as arranging the environment to encourage exposure, but not necessarily mastery. Each topic would likely be organized as a discrete experience, with learners moving through them like stations on a playground. An instructor choosing a weaving metaphor likely sees learners as ready to make connections between topics, integrating them into more advanced actions such as problem solving and decision-making. A weaving metaphor fits with a view of the instructor’s role as facilitator of connections between learner and content and between learner and course peers. Discussions and group work are common methods used to facilitate such connections. In the next section, we investigate how learning objectives inform the selection of instructional methods, and we make some preliminary decisions.

PLAN INSTRUCTIONAL METHODS

The process of choosing your instructional methods is creative and (hopefully) fun. Inspired by your metaphor and your view of your students’ needs, break out of old patterns of instruction; dare to try new things. In the section ahead, we’ll use a structured process for making choices about basic instructional methods. In Chapters 4 and 5 we plan the implementation of these methods, using technology to enable and facilitate them in online learning. Decisions are iterative, however; those made at each phase may suggest revisions to previous decisions.

There are four essential decisions involved in identifying instructional methods that directly support achievement of learning objectives: (1) decisions about the information our learners need, (2) decisions about which skills they will need to be shown, (3) decisions about how to help them practice those skills, and (4) decisions about how to ensure the learning will stay with them beyond the class. To help structure decisions in the design process, we rephrase them to spell the acronym DeSIGN: (De) Determine objectives, (S) Share information, (I) Illustrate skills, (G) Guide practice, (N) Nurture progress (Parker, 2005).


Next, we investigate each of the four essential decisions along with potential instructional methods to use in carrying them out. To illustrate this process, we revisit our example objective to identify aspects to be addressed by our methods. Table 3.2 reminds us of the objective and synthesizes the process by highlighting aspects of the objective, identifying the set of DeSIGN decisions that are most relevant, and listing representative instructional methods to be used in implementing them. We’ll walk through this process, describing each of the decisions and how they fit with the objective and methods. We begin by unpacking the objective to identify information that will need to be shared with learners.

Table 3.2. Plan Instructional Methods from Course Objectives.

Objective: Learners will demonstrate their knowledge of eight theories of motivation by listing their elements with 80% accuracy and applying all of the elements of a single theory to devise methods to improve employee morale and productivity in a given situation.

Element comprising the objective: Learners will list the elements of eight theories of motivation.
DeSIGN stage: (1) Share information; (2) Guide practice; (3) Nurture progress.
Potential methods: (1) Reading; lecture; video. (2) Flash cards; Jeopardy. (3) Quiz/exam.

Element comprising the objective: Learners will apply all of the elements of a single theory to a given situation.
DeSIGN stage: (1) Illustrate skills; (2) Guide practice; (3) Nurture progress.
Potential methods: (1) Demonstration using theory checklists. (2) Observation journals to gather situations for analysis; group mini-case applications. (3) In-class sharing of journals/group applications; structured feedback.

Element comprising the objective: Learners will devise methods to improve employee morale and productivity in a given situation, informed by theory.
DeSIGN stage: (1) Share information; (2) Illustrate skills; (3) Guide practice; (4) Nurture progress.
Potential methods: (1) Lecturette on methods and outcomes tied to theories; reading of relevant case examples. (2) Demonstration of devising methods informed by theory. (3) Extension of group mini-cases from previous element to include methods. (4) Larger case study application project (paper/presentation); application exam.
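If it helps to manipulate a plan like Table 3.2 outside of prose, it can be sketched as a simple mapping from each element of the objective to DeSIGN stages and candidate methods. This is an illustrative sketch only; the dictionary layout and the completeness check are my assumptions, not part of the DeSIGN model. One convenience of such a structure is that it makes gaps in the plan easy to spot.

# Hypothetical sketch: each element of the objective maps DeSIGN stages
# to the candidate instructional methods planned for that stage.
course_plan = {
    "List the elements of eight theories": {
        "Share information": ["reading", "lecture", "video"],
        "Guide practice": ["flash cards", "Jeopardy"],
        "Nurture progress": ["quiz/exam"],
    },
    "Apply all elements of a single theory to a situation": {
        "Illustrate skills": ["demonstration using theory checklists"],
        "Guide practice": ["observation journals", "group mini-case applications"],
        "Nurture progress": ["shared journals", "structured feedback"],
    },
    "Devise methods to improve morale and productivity": {
        "Share information": ["lecturette", "case readings"],
        "Illustrate skills": ["worked demonstration"],
        "Guide practice": ["extended group mini-cases"],
        "Nurture progress": [],  # hypothetical gap, left empty to show the check below
    },
}

# Flag any element/stage pair that still has no instructional method planned.
for element, stages in course_plan.items():
    for stage, methods in stages.items():
        if not methods:
            print(f"No method yet for '{element}' at stage '{stage}'")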

Share Information

Learners will need some information in order to ultimately perform the behaviors set out in the objectives. In our example objective, students will need information about the eight theories in order to recite their elements and later apply them. There are many methods by which instructors share information, but the classics are to assign readings and/or lecture. Instructors might also facilitate information sharing by assigning students to conduct primary and secondary research and share it with their peers through presentations, discussions, or jigsaw learning (Aronson, 2013). Any and all of the methods above can be adapted to online learning environments, but decisions about how will come in later phases of the redesign process. For now, just decide what type of information needs to be shared and the best method through which to do so. For instance, you can “tell them,” “refer them,” or “entice them.” Telling them may equate to a video lecture, while referring them might mean giving them a reading assignment. Enticing them could be a scavenger hunt or another activity through which learners uncover the information. Later decisions, made during the interaction and media phases of redesign, will further refine how instructional methods will be implemented. Next we revisit the example objective to identify the skills learners will need demonstrated so they can perform them.

Illustrate Skills

Many of us learn behaviors best when they are modeled for us. To behave in the ways set out by the course objectives, learners will need critical skills illustrated for them. In our example objective, learners will need to see how the elements of theories should be applied to situations so they can devise interventions. Skills are typically illustrated through demonstration. The instructor can direct learners through the process of applying the elements of a theory to arrive at a specific intervention using examples. This step will likely come after students have been introduced to the theories through whatever means chosen to share information. Skills can also be illustrated through work samples, simulations, and role-plays. As with information sharing, methods for illustrating skills can be adapted for online learning; decisions about specific adaptations will be made during the interaction and media phases of redesign. For now, we will decide what skills need to be illustrated and the general methods through which to show them. Next we’ll look at how to guide learners in practicing the skills needed to accomplish course objectives.

Guide Practice

In order to actually perform the skills, learners will need to practice them. In our example objective, there are three skills they will need to demonstrate accurately. Learners will need to list (elements of theories), apply (the elements of a single theory), and devise (methods of improvement). Whenever possible, have learners apply skills to authentic tasks. Authenticity is critical to developing their abilities to perform the skills in contexts beyond the class. To help learners list elements, instructors might consider methods that facilitate memorization, such as flashcards, crossword puzzles, or the game Jeopardy (Benek-Rivera & Mathews, 2004). To assist learners in applying the elements of theory, instructors could use methods such as mini-case analysis, structured field observations, and buzz groups (Svinicki & McKeachie, 2010). To aid in devising interventions to improve employee motivation and productivity, learners will need to synthesize and evaluate information. Instructional methods such as problem solving, critical incident technique, or simulations facilitate higher-order skills (Hermanowicz, 1961; Lee & Caffarella, 1994). As with information sharing and illustration of skills, methods of guiding practice can be adapted to online learning, which we investigate during the interaction and media phases of redesign. Next we explore ways to nurture progress, determining how well learners have achieved the objective and facilitating future application of skills outside of the classroom.

Nurture Progress

The final set of decisions in the DeSIGN process focuses on skill assessment and on encouraging future application of skills in post-requisite courses or on the job. To nurture progress, we will consider evaluative methods as well as methods that will aid in generalization of skills from this course to other contexts. Methods should allow instructors to determine whether students can perform the skills they’ve practiced at the levels of performance specified in the objective. Using our example objective, the goal was for learners to list elements of eight theories with 80% accuracy, to correctly apply the elements of a single theory to a given situation, and, based on the resulting analysis, to devise methods that would predictably improve employee morale and productivity. The presumption is that if students can do this with one theory, they can learn to do it with the others.

In order to check on mastery of the elements of theory, instructors would likely employ an evaluative method such as an objective examination and look for learners to earn 80% or better. If learners do not achieve this score, remedial instruction could be implemented and the exam administered again. To check on accuracy in application of a single theory to a given situation, instructors could have learners share field observations, along with their theoretical analyses of them, in groups or as a formal presentation. Learners could use peer and instructor feedback to refine their analyses. To assess the likelihood that the interventions devised would improve employee motivation and productivity, instructors could rely on a larger case-study project, a branching simulation, or application questions on an examination (Gordon, 2004). Decisions about nurturing progress will influence interaction and media choices, and they have strong implications for evaluation. In Chapter 6, we further investigate evaluation of learning as well as issues of evaluation unique to online environments.
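The mastery check described above amounts to a simple decision rule: learners at or above the threshold move on, and the rest receive remedial instruction and a retest. A minimal sketch of that rule follows, assuming a 0–100 percent score scale; the function name and the example scores are hypothetical.

MASTERY_THRESHOLD = 80  # percent accuracy required by the example objective

def needs_remediation(score: float, threshold: float = MASTERY_THRESHOLD) -> bool:
    """Return True when an exam score falls below the mastery threshold."""
    return score < threshold

# Hypothetical exam results for a small class.
scores = {"Avery": 92.5, "Blake": 74.0, "Casey": 80.0}

for learner, score in scores.items():
    if needs_remediation(score):
        # In practice this would trigger remedial instruction and a second exam.
        print(f"{learner}: {score}% -> assign remediation and re-administer the exam")
    else:
        print(f"{learner}: {score}% -> mastery demonstrated")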

CONCLUSION

There are a number of key considerations related to instructional design involved in the course redesign process. Many of these decisions are part of good instructional design regardless of the delivery mode. We advocate for making design decisions in advance of any related to technology; media choices should wait until after you know what you wish to use them to accomplish. We begin the DeSIGN process after selecting our course metaphor. We analyze the learning needs of our target audience. Reflectively, we set course objectives that are observable, measurable, attainable, and specific. We ensure the objectives fit with our metaphor as well as our target audience. We also look to our larger program of study to determine fit with skills coming from prerequisite courses or to be developed for post-requisite courses. Determining our learning objectives is the most crucial step in the process. Unclear objectives will lead to fuzzy rationale surrounding the selection of instructional methods, which affects later decisions related to interaction, media, and evaluation.

This chapter focused on design, the first of the four course redesign phases featured in the DIME model. Building upon the decisions made here, we turn our attention to considerations related to interaction. Specifically, we explore learner interactions with content, with the instructor, and with peers. We investigate the ways each type of interaction affects the learning experience and influences the course redesign process.

NOTES

1. DeSIGN is an instructional design process previously developed by the author that structures the setting of objectives and the selection of instructional methods.
2. Declarative goals relate to mastery of terms and content knowledge. Procedural goals relate to application.
3. Locke’s goal-setting theory proposes increased motivation when goals are challenging but achievable (Locke & Latham, 2006).

REFERENCES

Aragon, S. R., & Johnson, E. S. (2008). Factors influencing completion and non-completion of community college online courses. The American Journal of Distance Education, 22(3), 146–158.

Arnold, W. E., & McClure, L. (1995). Communication training and development (2nd ed.). Long Grove, IL: Waveland.

Aronson, E. (2013). The jigsaw classroom. Retrieved from http://www.jigsaw.org. Accessed on February 25, 2013.

Benek-Rivera, J., & Mathews, V. E. (2004). Active learning with jeopardy: Students ask the questions. Journal of Management Education, 28, 104–118.

Blau, I., & Barak, A. (2012). How do personality, synchronous media, and discussion topic affect participation? Educational Technology & Society, 15(2), 12–24.

Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals, Handbook I: Cognitive domain. New York, NY: Longman-Green.

DiBiase, D., & Kidwai, K. (2010). Wasted on the young? Comparing performance and attitudes of younger and older US adults in an online class on geographic information. Journal of Geography in Higher Education, 34(3), 299–326.

Eddy, J. M., Donahue, R. E., & Chaney, J. D. (2001). A contextual relative approach to designing a master’s program in health education using distance education technologies. The International Electronic Journal of Health Education, 4, 377–384.

Gordon, A. S. (2004). Authoring branching storylines for training applications. In Y. B. Kafai, W. A. Sandoval, N. Enyedy, A. S. Nixon, & F. Herrera (Eds.), Proceedings of the sixth international conference of the learning sciences (pp. 230–237). Mahwah, NJ: Lawrence Erlbaum.

Gosling, S. D., Augustine, A. A., Vazire, S., Holtzman, N., & Gaddis, S. (2011). Manifestations of personality in online social networks: Self-reported Facebook-related behaviors and observable profile information. Cyberpsychology, Behavior, and Social Networking, 14(9), 483–488.

Harrington, R., & Loffredo, D. A. (2010). MBTI personality type and other factors that relate to preference for online versus face-to-face instruction. Internet and Higher Education, 13, 89–95.

Heffner, M., & Cohen, S. H. (2005). Evaluating student use of web-based course material. Journal of Instructional Psychology, 32(1), 74–81.

Hermanowicz, H. J. (1961). A critical look at: Problem solving as teaching method. Educational Leadership, 18, 299–306.

Hoskins, S., & van Hoof, J. (2005). Motivation and ability: Which students use on-line learning and what influence does it have on their achievement? British Journal of Educational Technology, 36(2), 177–192.

Jost, B., Rude-Parker, C., & Githens, R. P. (2012). Academic performance, age, gender, and ethnicity in online courses delivered by two-year colleges. Community College Journal of Research and Practice, 36, 656–669.

Knowles, M. S., Holton, E. F., & Swanson, R. A. (2005). The adult learner (6th ed.). Amsterdam: Elsevier.

Kolb, D. A. (1981). Learning styles and disciplinary differences. In A. W. Chickering (Ed.), The modern American college: Responding to the new realities of diverse students and society (pp. 232–255). San Francisco, CA: Jossey-Bass.

Lawless, C. J., & Richardson, J. T. E. (2002). Approaches to studying and perceptions of academic quality in distance education. Higher Education, 44, 257–282.

Lee, P., & Caffarella, R. S. (1994). Methods and techniques for engaging learners in experiential learning activities. New Directions for Adult and Continuing Education, 62, 43–54.

Leeds, E., Campbell, S., Baker, H., Ali, R., Brawley, D., & Crisp, J. (2013). The impact of student retention strategies: An empirical study. International Journal of Management in Education, 7(1–2), 22–43.

Levy, Y. (2004). Comparing dropouts and persistence in e-learning courses. Computers & Education, 48, 185–204.

Locke, E. A., & Latham, G. P. (2006). New directions in goal-setting theory. Current Directions in Psychological Science, 15(5), 265–268.

Nash, R. D. (2005). Course completion rates among distance learners: Identifying possible methods to improve retention. Online Journal of Distance Learning Administration, 8(4). Retrieved from http://www.westga.edu/%7Edistance/ojdla/winter84/nash84.htm. Accessed on January 24, 2013.

Nichols, A. J., & Levy, Y. (2009). Empirical assessment of college student-athletes’ persistence on e-learning courses: A case study of a U.S. National Association of Intercollegiate Athletes (NAIA) institution. Internet and Higher Education, 12(1), 14–25.

Ogbu, J. U. (2003). Black American students in an affluent suburb: A study of academic disengagement. Mahwah, NJ: Erlbaum.

Parker, R. E. (2005, January). Design made simple: Leadership training and development for managers. Presentation to the annual alumni conference of Leadership Portage County, Rootstown, OH.

Parker, R. E., Bianchi, A., & Cheah, T. (2008). Exploring student and faculty perceptions of technology in education. Educational Technology & Society, 11(2), 274–293.

Perry, W. G. (1999). Forms of ethical and intellectual development in the college years: A scheme. San Francisco, CA: Wiley.

Pierrakeas, C., Xenos, M., Panagiotakopoulos, C., & Vergidis, D. (2004). A comparative study of dropout rates and causes for two different distance education courses. International Review of Research in Open and Distance Learning, 5(2). Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/183/804

Richardson, J. T. E. (2003). Approaches to studying and perceptions of academic quality in a short web-based course. British Journal of Educational Technology, 34, 433–442.

Richardson, J. T. E. (2005). Students’ perceptions of academic quality and approaches to studying in distance education. British Educational Research Journal, 31, 7–27.

Richardson, J. T. E. (2006). Investigating the relationship between variations in students’ perceptions of their academic environment and variations in study behavior in distance education. British Journal of Educational Psychology, 76, 867–893.

Richardson, J. T. E. (2012). The attainment of white and ethnic minority students in distance education. Assessment & Evaluation in Higher Education, 37(4), 393–408.

Severiens, S., & ten Dam, G. T. M. (1994). Gender differences in learning styles: A narrative review and quantitative meta-analysis. Higher Education, 27(4), 487–501.

Sitzmann, T., Brown, K., Ely, K., Kraiger, K., & Wisher, R. (2009). A cyclical model of motivational constructs in web-based courses. Military Psychology, 21, 534–551.

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology, 59, 623–664.

Svinicki, M., & McKeachie, W. J. (2010). Teaching tips: Strategies, research and theory for college and university teachers (13th ed.). Boston, MA: Cengage.

Wojciechowski, A., & Palmer, L. B. (2005). Individual student characteristics: Can any be predictors of success in online classes? Online Journal of Distance Learning Administration, 8(2). Retrieved from http://www.westga.edu/~distance/ojdla/summer82/wojciechowski82.htm. Accessed on July 26, 2013.

CHAPTER 4

INTERACTION CONSIDERATIONS

ABSTRACT

Learner-centered interactions determine the look and feel of online courses, influencing the way learners experience them. In this chapter we investigate considerations related to three types of interactions: learner-content, learner-instructor, and learner-learner. Learners interact with content through the course structure and layout. They also interact with peers, who may be cast in the role of community members, there to provide social support, or who may be more prominently cast as information providers and/or collaborators. The learner is at the center of both content and peer interactions. Instructor interactions set expectations for learners and facilitate learner interactions with content and peers. Instructors are instrumental forces in bringing about connections between learners, enabling the social presence necessary for collaboration. Instructor interaction may also be relational, enabling individualized connections between learners and the instructor. Redesign decisions center on creating a course structure that fits the learner and content and results in a satisfying course experience. We use the power of metaphor to bring into focus the most relevant considerations. In the end, we illustrate the redesign of a single course through the lens of three separate metaphors to demonstrate how metaphor shapes the process, bringing together design and interaction decisions to create unique and elegant course designs.

Keywords: Course redesign; online interaction; online teaching and learning; social presence; immediacy behaviors; online education.


INTRODUCTION

Now that we’ve made some initial decisions about our learning objectives and instructional methods, we’re ready to attend to the next set of considerations in course redesign, those related to interaction. We’re still not ready to consider specific technologies; discipline is needed as we redesign our courses. As previously described, the DIME model separates course redesign into four prime areas of consideration: (1) design, (2) interaction, (3) media, and (4) evaluation. This model structures the course redesign process by providing a framework for decision-making. We explore the elements of the model separately in this volume, but they are interdependent and not completely distinct from one another. Therefore, decisions related to each phase of course redesign are iterative and ongoing.

Decisions related to interaction will strongly influence the type of experience learners and instructors have within a course. For instance, in traditional classrooms, research has established a strong relationship between student perceptions of learning and teacher immediacy behaviors (Witt, Wheeless, & Allen, 2004). Teacher immediacy behaviors consist of nonverbal cues, like maintaining eye contact, using expressive gestures, and standing near students, and verbal cues, such as affirming comments and self-disclosure. These behaviors help engage students in course activities and are associated with a number of positive outcomes, such as increased student motivation to learn (Allen, Witt, & Wheeless, 2006). Interactions that have the potential to increase motivation to learn would be particularly helpful in required courses in higher education. Required courses are classes students must enroll in but often have no particular interest in taking. In this case, the intrinsic motivation associated with interest in the subject would be absent, at least at the start (Leong, 2011). Past research suggests that teacher immediacy behaviors help students feel connected in the classroom, which has been found to affect both levels of satisfaction and rates of course completion, constructs closely related to motivation (Leach & Zepke, 2010).

Once courses move online, immediacy behaviors become mediated. As instructors, we lose access to the nonverbal cues that signal to us whether students understand or are confused. Learners also miss out on the spontaneous and vicarious cues provided by other learners. To compensate for these losses, interaction must be more carefully planned and provided for in course redesign. Interaction becomes more instrumental when we move courses online.


In the remainder of the chapter, we explore the three types of learner-centered interactions important to course redesign: interaction with content, interaction with the instructor, and interaction with peers. We highlight the essential role instructors play in facilitating learner interactions with content and peers; interaction, while learner-centered, is instructor-driven. At the end of the chapter, we revisit the power of metaphor in shaping design and interaction decisions, bringing about a unified course concept.

TYPES OF LEARNER-CENTERED INTERACTIONS

Michael Moore first proposed the three types of interaction we explore. Moore (1989) argued that in distance education courses learners interact first and foremost with the content, and then also with the instructor and other learners. Briefly, interaction with content is the ability of learners to access, manipulate, synthesize, and communicate content information. Learner-content interaction involves redesign decisions related to the overall structure and layout of a course. This includes decisions about navigation, consistency, learning activities, and elements of learner control. It also includes decisions about interaction timing, such as synchronous vs. asynchronous interaction (Janicki & Liegle, 2001; Swan, 2002).1

Interaction with the instructor is the ability of learners to communicate with, and receive feedback from, the instructor. Learner-instructor interaction involves redesign decisions related to communicating expectations, providing feedback, establishing social presence,2 and facilitating learner engagement (Swan & Shih, 2005). This interaction is also relational in nature. It involves inter-subjective communication that can be maintained even at a physical distance (Giuseppe & Galimberti, 1998). In other words, learners and instructors are in an interpersonal relationship, the quality of which is subject to the desires, perceptions, and actions of both parties.

Interaction with peers is the ability of learners to communicate with each other about content and to create an active learning community. This interaction may be relational, depending upon its quality and character. Redesign decisions related to learner-learner interaction will largely depend upon the roles peers are cast in relative to the learning process. Peers may play a central role, as they would in collaborative learning. Or they may be tangential, or completely absent, as they would be in a course designed for independent learning such as computer-based training.


Whatever the role of peer learners, it should be explicitly defined and communicated to students. Decisions made about the role will likely be influenced by instructor views about the nature of learning, whether it is mostly cognitive or social. Instructors who see learning as primarily a cognitive pursuit will likely cast peers in a minor role. Those who see learning as essentially a social endeavor will likely cast peers in a larger role. This fundamental choice will then inform more specific redesign decisions about facilitating social presence, encouraging participation, and choosing between synchronous and asynchronous interaction methods.

All three types of learner-centered interactions should be considered during course redesign, as all three play a role in how learners experience online courses. Swan (2002) examined students' perceptions of learning, along with their levels of satisfaction, in a study of 73 online courses. She found three factors to be strongly related to both: clarity and consistency of course design, contact with and feedback from instructors, and active and valued discussion. In other words, student perceptions of learning and satisfaction were influenced by their interactions with the content, with the instructor, and with their peers. How important each type of interaction will be to your course will vary based upon learning objectives and instructional methods. We begin by examining each type of interaction in more detail, along with its most relevant redesign decisions.

Learner Interaction with Content

Decisions about how learners should interact with course content will be informed by the chosen learning objectives and instructional methods. Moore (1989) suggested that it is learner-content interaction that makes the experience educational. In other words, Moore sees content as the driver of the learning process. As Moore further pointed out, "some learning programs are solely content-interactive in nature" (p. 101).

In early distance education programs (i.e., correspondence courses), learners received materials, read and practiced with content, and then submitted an examination by mail. Today, massive open online courses (commonly referred to as MOOCs) similarly privilege interaction with content. Learners enroll in the course and move through materials with little or no required interaction with others. This structure may or may not be appropriate for your course, given your content and your target audience. As we've said before, there is not one best way to redesign a course.


Rather, the DIME model encourages reflection on the considerations, so you can make the choices that best fit your situation.

Fitting Course Structure to Learner and Content

The layout and structure of your course determine the way information will be organized and presented and how learning activities will be conducted. These decisions may be retroactively affected by the capabilities of a learning management system (LMS) that you choose or are required to use, but for now we want to create a vision of how learners should experience the course. Later, during the media phase of redesign, we will find a means for the technology to enable that vision. Resist defaulting to a layout developed around the number of topics to be covered or the number of weeks in a course. These "bucket models" are well supported by an LMS, but they may not be the best way to engage your learners with your content. To get learners actively interacting with content, create a course layout that helps learners reach course goals. By way of example, we revisit the course objective first introduced during the design phase.

Sample Objective

Learners will demonstrate their knowledge of eight theories of motivation by listing their elements with 80% accuracy and applying all of the elements of a single theory to devise methods to improve employee morale and productivity in a given situation.

To develop the course layout, we return to decisions made about the general methods of instruction to be used in accomplishing the objective. Specifically, we chose the information to share, the skills to illustrate, the means by which to guide learners in practicing the skills, and how to nurture their future progress (Parker, 2005). To illustrate the intersection of design decisions with content interaction decisions, we'll focus on the information-sharing aspect of the sample objective: learners will demonstrate their knowledge of eight theories of motivation by listing their elements. During the design phase, we decided to share information about the elements of eight specific theories, which could be accomplished through assigned course readings and/or lecture. Decisions about the specific technologies to use in the delivery will be made later, during the media phase. First, we need to determine how learners will interact with the information shared by creating a course layout that makes it easy for learners to find the material and to navigate to other course elements.

There are a number of ways the course could be structured. One way is to set the course up around the eight theories, each theory in its own content bucket.


This structure works well if learners won't need to integrate information across theories. Given the remainder of our objective (applying a single theory to devise interventions), this could work. However, there may be better ways to organize. Research has shown that fewer, well-structured modules work better in terms of easing learner navigation (Swan, 2002).

Another way we could structure the course is by organizing the theories into modules based upon the theoretical perspective each represents. The layout would group the "needs" theories in one module, "behavioral modification" theories in another, and "experienced-treatment" theories in a third. This structure would reduce the number of modules from eight to three and enable deeper levels of learner engagement with the material. By grouping theories together, learners could compare and contrast perspectives, which may better inform the application of theory.

There are still other ways to organize that could be explored. Information about all of the theories could be located in one place, with course modules centering on learning activities rather than information shared. In one module, learners could engage in activities that result in their ability to accurately list theory elements. In another, they could apply various theories to scenarios to determine the motivational issues. In yet another, they could devise methods for improving employee morale.

Whatever layout you choose, it should fit your content, your learners, and your instructional style. As previously discussed, your views about that fit are likely reflected in your chosen course metaphor. Later in the chapter we look at more detailed examples of how metaphor, design, and interaction decisions come together to create a unified course layout. First, we consider some additional research that should inform learner-content interaction decisions.

Additional Content Structure Considerations

At the start of your course, learner-content interactions will increase the cognitive load of your students. In any class, there is a learning curve associated with new content, but when you move your course online, your learners will also have to become familiar with the way the course operates. And they must do so without the benefit of the vicarious cues available in traditional classrooms. In the beginning, learners must figure out where to find things, how to participate in activities, and what is expected of them in terms of participation, all while being introduced to new subjects. Together, these tasks put a drag on learners' ability to process information and perform.


Learners will need time to let the course structure become second nature before they are ready to truly engage with the subject. Research on multitasking and the brain suggests we are only capable of multitasking if one of the tasks is fully automatic (Doyle, 2011). For example, when we first learn to drive, it is very difficult to drive and hold a conversation with a passenger. Our cognitive energies are aimed at reading signs, obeying speed limits, and anticipating the actions of other drivers. With experience, these activities become more automatic, and we can drive and talk without experiencing the stress of cognitive overload. Similarly, the navigation of a course will not be automatic until it becomes familiar and predictable.

Simplicity and redundancy work best to facilitate student navigation through online courses. Designing each of the modules with similar components and clear instructions helps to make navigation an automatic task, freeing learners' cognitive energy for engaging with the content itself. Research supports this approach. Sheridan, Kelly, and Bentz (2013) surveyed 181 undergraduate students enrolled in online courses at two large, Midwestern universities in the United States to find out which instructor actions were most important to them. In the top 10 were: creates a course that is easy to navigate; provides clear instructions on how to participate in course learning activities; and clearly communicates important due dates and time frames for learning activities.

One of the greatest opportunities in redesigning courses for online delivery is the chance to rethink the interactions learners will have with content. The redesign process can enable content interactions not previously considered. For example, students could move through the course at different speeds or complete different assignments based upon their individual learning needs. Content interactions might also be the product of learners' own choices (learner control), or they might be dictated by learners' performance (i.e., machine scoring that leads to a set of personalized learning activities).

Decisions related to all three types of learner-centered interactions affect the way learners experience the course, perhaps more than any other set of considerations. It is through interaction, whether with content, instructor, or peers, that teaching and learning come to life. Next we explore considerations related to learner-instructor interaction.

Learner Interaction with Instructor

If interaction with content is the defining characteristic of education, as Moore (1989) suggested, it is learner-instructor interactions that are essential to the art of teaching.


Decisions about instructor interaction will have consequences for the way both learners and instructors experience a course. However, instructor and student views of an experience do not always align. In a study of 485 instructors and 3,145 students at a large Midwestern university in the United States, students and instructors had varied perceptions of the effects of classroom technologies on learner connections with the instructor and peers. They also varied in their views of the effects technology use had on student behaviors such as class attendance and discussion participation (Parker, Bianchi, & Cheah, 2008). Instructors believed classroom technologies enhanced learner-instructor and learner-learner relationships more than students did. Students believed the technology encouraged their attendance, participation, and concept memory more than instructors did.

Perceptions often differ from reality, but they represent the way individuals experience and interact with their world. Left on their own, learners may interpret the meaning behind learning activities differently than the instructor envisioned. Learner-instructor interactions are those that help facilitate the learning experience as it was intended. Online interactions may be learner-centered, but they are instructor-driven. Learners need guidance from the instructor about how to interact with content, with peers, and with the instructor. Even if interaction doesn't involve direct access to the instructor or peers, instructors need to set those expectations. Without learner-instructor interaction, the teaching aspects of online teaching and learning are absent.

Instructor interactions can be organized into three prime categories: administrative, facilitative, and relational. Administrative interactions help set expectations for participating in the course; facilitative interactions support learner interactions with content and with other learners; and relational interactions focus on communication that results in a mutually satisfying interpersonal relationship between learner and instructor.

Administrative Interactions

In online courses, we set expectations similar to those set in a traditional face-to-face class. For instance, it is common practice for instructors to set expectations about assignments, such as due dates and performance standards. These types of expectations are often included in course syllabi, where they are explicitly expressed. Other expectations are shared less overtly in face-to-face classes. For instance, indications about whether it is considered rude to speak out in class, as well as rules for taking turns speaking, are implicitly set through instructors' regulation of classroom discussions using eye contact and gestures.


The physical setup of the learning space also provides clues about expected behaviors. When courses move online, however, we lose many of the nonverbal and vicarious cues used to regulate learner behavior. This creates uncertainty that can lead to improper behaviors. Behavioral and performance expectations need to be overtly set to ensure learners know what to do. For instance, if instructors expect learners to participate in discussions, they will need not only to tell learners how to participate (i.e., post to the discussion board), but also to indicate that it is an important behavior by monitoring it and providing learners with feedback about it. Grading rubrics that delineate between levels of performance are useful in communicating these types of expectations in a way that learners understand.

Another set of expectations that should be made explicit to learners is the availability of the instructor. Precise descriptions of when and how to interact with the instructor, in terms of the communication channels to use, the amount of time to wait for a response, and the types of information the instructor will and will not provide, help to reduce learner uncertainty. Students seem able to deal with whatever the schedules are, so long as they know what to expect. This has held true in my own experience and in the experiences of other instructors I've encountered. For instance, during a panel discussion at the Association of Writers & Writing Programs conference held in Boston, Massachusetts, novelist and online creative writing instructor A. J. Verdelle spoke directly to this point. She tells students to send her a note anytime, but to expect a response during her next scheduled office time. So if her office hours3 are on Tuesdays and Thursdays from 2 p.m. to 4 p.m., and a student sends a note on Friday, the student should expect to receive a response the following Tuesday (personal communication, August 3, 2013).

In my own experience, setting expectations for how available I will be electronically has also enabled my students to set parameters for their own participation. I generally do not log into the course on weekends. My students know that they should contact me by 5 p.m. on Friday if they need my help on work they plan to do over the weekend. If they email on Saturday, they know I'll respond on Monday morning. They also know I will not post additional material or request responses from them over weekends, freeing them to be offline as well. Setting explicit expectations helps both learners and instructors better manage their time and better meet one another's expectations.
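Availability rules like these are regular enough to be written down as a tiny scheduling routine. The following is a minimal, purely illustrative sketch in Python, assuming the Tuesday/Thursday schedule from the example above; the function name and weekday encoding are conveniences for the example, not part of the DIME model or any LMS:

```python
# Minimal sketch of the "next scheduled office time" rule described above.
# Weekday numbering follows Python's datetime convention: Monday = 0 ... Sunday = 6.
# The Tuesday/Thursday office days are an assumed example schedule.

OFFICE_DAYS = {1, 3}  # Tuesday, Thursday

def next_response_day(sent_day: int) -> int:
    """Return the weekday on which a student should expect a reply."""
    for offset in range(1, 8):  # scan the next seven days
        candidate = (sent_day + offset) % 7
        if candidate in OFFICE_DAYS:
            return candidate
    raise ValueError("no office days configured")

# A note sent on Friday (weekday 4) is answered the following Tuesday (weekday 1).
assert next_response_day(4) == 1
```

However the schedule is expressed, the point is the same: when an availability rule is explicit enough to be stated this plainly, learners can plan their own work around it.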


Next, we turn to the facilitative interactions that instructors must engage in to bring about the planned course experience.

Facilitative Interactions

Facilitative interactions are learner-instructor interactions specifically designed to bring about learner engagement. The interactions might concentrate on engaging learners with the content, engaging learners with each other, or both. One way instructors facilitate content interaction is by demonstrating the level of engagement they expect of students. Instructors should model the expected frequency, timeliness, and character of interactions, whether with content or with peers (Ruberg, Moore, & Taylor, 1996).

Consider your vision for how interactions will take place in your course. Do you envision students mostly interacting with you, the instructor? I call this the private lesson model. Alternatively, you could have students interact almost exclusively with content, in a kind of correspondence course model. Or you could cast peers as key information resources, using a constructivist-inspired model like the community of inquiry, which assumes that learning is a social, collaborative process (Garrison, 2013). Whatever the intended model, it should be specifically chosen, explicitly articulated, and demonstrated for your learners.

If the instructor fails to facilitate the desired interactions, learners are unlikely to arrive at them on their own. For instance, in one of my first attempts to redesign a course for online delivery, I had students submit preliminary ideas about an assignment to a discussion board. My intent was to get the students "talking" to each other about the assignment before undertaking it for themselves, similar to a process I used in my face-to-face classes. I did not specifically set the expectation that learners were to respond to the submissions of others, and few students responded. Instead of generating a class discussion as planned, conversations took place primarily between individual students and me. Inadvertently, the demonstrated activity became: a student posts, I respond. This created a model for student learning that was different from what I had envisioned. For the remainder of that class, learners turned only to me for information. If students had questions about anything, from assignment due dates to the effectiveness of revisions to where to find course materials, they sent a message to me and waited for an answer. Although this approach worked, it didn't work as intended, and the burden of responding quickly became overwhelming. In retrospect, the root of the problem was in hoping that students would build relationships with each other to learn collaboratively, but failing to facilitate the learner-learner interactions necessary to accomplish it.


If you want students to interact with one another, you must facilitate it (Picciano, 1998). Begin by setting expectations for peer interaction and making it part of class performance (Hawisher & Pemberton, 1997). If you're going to expect participation, you have to design for it and make it something students are accountable for; it won't just happen. Place value on discussion by providing students with feedback (either through grades or other means) about how well they are meeting class expectations.

Facilitating online discussions can be challenging. As I discovered, there are times the instructor should wait to participate in the discussion. If you weigh in too soon, you risk shutting down the class discussion. However, if you wait too long to comment, learners may misinterpret your silence as disinterest. Hosler and Arend (2013) compare instructor "wait time" in face-to-face and online contexts. They suggest that, in the classroom, an instructor should wait at least 3 seconds after asking a question before commenting further, and even longer for higher-order questions. In contrast, they say, online wait time could be a matter of days. They suggest not weighing in on every post or answering every question, because once the authority of the instructor speaks, the discussion ends. Collison, Elbaum, Haavind, and Tinker (2000) suggest instructors should occasionally post general comments to the whole class that reference specific student posts, summarize, and refocus the discussion. According to Arend (2009), facilitation should be less frequent in online classes, but more purposeful.

Instructors will also want to facilitate appropriate kinds of interactions by establishing a safe environment in which differences of opinion can be expressed. This will be particularly important if peer interactions are important to the learning process. Weiss (2000) recommends explicitly instructing students in the way they should approach peer interactions, reminding them that there are real people on the receiving end of their messages. Unlike in the face-to-face classroom, the instructor cannot shape and direct interactions as they unfold. Interaction management needs to be proactive and unambiguous.

If learner-learner interactions are to play a key role in a course, peer relationships will also need to be facilitated by the instructor. One approach is to put elements in place to bring about social presence. Short, Williams, and Christie (1976) defined this concept as the "psychological closeness" enabled by a particular communication channel. More recent definitions of social presence move learners to the center, recognizing that their behavior, as much as or more than the channel used, influences how close they feel.


For example, Garrison, Anderson, and Archer (1999) defined social presence as "the ability of participants to project themselves socially and emotionally, as 'real' people through the medium of communication being used" (p. 90).

Instructors also play a role in bringing about social presence. There are things they can do to facilitate it. One simple way is to have students post a brief personal history or profile for others to read, and to post one of your own (Aragon, 2003). To make this even more effective, create a class activity that uses the contents of peer profiles. This should inspire learner connections by revealing similarities in backgrounds and attitudes, encouraging interaction and relationship building (Burleson, Samter, & Lucchetti, 1992; Sunnafrank & Andersen, 1991). Aiding learners in projecting themselves to their peers, and then finding things they have in common with others, stimulates feelings of social presence.

Another way to facilitate social presence, and learner-learner interaction by extension, is to model the tone and style of interactions indicative of respect and interest. For instance, instructors may want to address learners by name, rather than writing to the class in general. Another way to express social presence is to quote directly from the posts of individual learners, extending their ideas or asking questions. By modeling specificity in posts, and using a supportive tone, instructors inspire learners to do the same (Stacey, 2002; Vrasidas & McIsaac, 2000). If learners do not use the desired interaction style, instructors could overtly deconstruct their own posts, pointing out the elements that should be emulated. They could also provide feedback to individual learners, offline, about how they might interact more effectively.

Instructors will also want to consider social presence when they make media choices in the next phase of course redesign. One media feature that seems to help in facilitating interaction between learners is awareness that others are online at the same time. Called presence awareness, it involves a means to indicate that individual learners are online and active in the course space. A familiar technology that features presence awareness is instant messaging (IM). Learners know whether a person is available to chat by way of a status indicator. Studies have shown that students who use IM are more likely to report that it was easier to communicate with peers, and that they felt more like part of a learning community, than those who did not use IM (Nicholson, 2002). Social media technologies such as Facebook have presence awareness, which users can turn on and off at will. Some LMSs have presence awareness built in as well.
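To make the idea concrete, the sketch below models the minimal state a presence-awareness feature tracks: who has been active in the course space recently enough to be shown as "online." It is a generic Python illustration, not the implementation of any particular LMS, IM client, or social network; the class name and the five-minute window are assumptions chosen for the example:

```python
from datetime import datetime, timedelta

ONLINE_WINDOW = timedelta(minutes=5)  # assumed recency window

class PresenceTracker:
    """Toy model of a presence-awareness (status indicator) feature."""

    def __init__(self) -> None:
        self.last_seen: dict[str, datetime] = {}

    def record_activity(self, user: str) -> None:
        # Called whenever a learner does something in the course space.
        self.last_seen[user] = datetime.now()

    def online_users(self) -> set[str]:
        # Learners active within the window are shown as "online."
        now = datetime.now()
        return {user for user, seen in self.last_seen.items()
                if now - seen <= ONLINE_WINDOW}
```

Whether presence is surfaced through an IM-style status light or an LMS dashboard, the underlying mechanism is this simple; what matters for redesign is deciding whether seeing who else is "in the room" serves your interaction goals.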


Before we can determine how important social presence will be in our course, we need to determine the significance of peer learners to the learning process. If they will play an important role, the instructor will have to facilitate learner-learner interaction. For now, put social presence on the list of considerations. We will further investigate decisions about peer interaction after exploring the final category of learner-instructor interactions: relational interactions.

Relational Interactions

In the previous section, we explored facilitative interactions. These are the interactions most closely aligned with the concept of teaching presence.4 Teaching presence is learner-centered. It focuses on bringing about cognitive and social presence in support of learner interaction with content and peers. Teaching presence is necessary, but not sufficient, to bring about interpersonal relationships between learners and instructors. In this section we consider relational interactions and highlight what distinguishes them from facilitative interactions.

So far, we have considered our learners in the aggregate. The better we get to know our learners (in general), the better we can design our courses to meet their learning needs (on average). This is a practical approach to the redesign process, especially for higher education courses that attract a variety of learners. But in order to build personal relationships with students, a desired activity for many instructors, you have to get to know them as individual people, and they need to get to know you. Instructors sometimes view online environments as unable to support this type of relationship building. Although it has been argued that mediating relationships through technology affects their character and quality, authentic relationships are still quite possible (Parker, 2003).

Fig. 4.1 depicts the various online interactions, differentiating facilitative from relational. When it comes to content and peer interactions, the learner is placed at the center. In the figure, the double-headed arrows between the learner and the other course entities indicate that the learner is interacting directly with content and peers, assuming peers play a role. Notice, however, that the learner and instructor are placed parallel to each other. As discussed in the previous section, instructors facilitate how learners interact with content and peers. The oscillating arrows represent those facilitative interactions. These interactions are carried out through decisions made about the design of the course, such as the learning objectives, instructional methods, and course structure. Instructors also facilitate learner-learner interactions through their decisions and behaviors. Providing a means to bring about social presence, setting expectations, and modeling desired behaviors all help bring about a sense of community within the online class.


Fig. 4.1. Nature of Online Learner-Instructor Interactions.

When we turn our attention away from facilitative interactions and focus on relational interactions, suddenly the learner is NOT at the center; neither is the instructor. Instead, we see their relationship as reciprocal. Relationships are co-created by the learner and instructor. Both parties have input into deciding what kind of relationship they will have, and the character and quality of each learner-instructor relationship will be unique. The learner and the instructor are both principal players in the interactions that create the relationship, and there are things both parties can do to build it. As the instructor, it will be important for you to communicate a vision for, and your openness to, an individual relationship with your students. Learners will perceive your vision and openness via your course design choices and through the behaviors you exhibit.

The perspective on relational communication depicted in Fig. 4.1 is reminiscent of the views of American psychologist Carl Rogers, who first proposed the person-centered approach. This approach suggests that in professional contexts, although not to the exclusion of personal settings, a relationship requires four elements to bring about growth: congruence, empathy, positive regard, and the perception by both parties that the other three are in place (Rogers, 1962).


Congruence speaks to the idea that both parties see one another as authentic people. Attitudes and feelings expressed are perceived as real, not superficial or part of a professional façade. Empathy involves knowing enough about the other party to understand how she or he sees the world. Positive regard involves having a warm, accepting attitude toward the other party. This helps others feel cared about (Rogers, 1962).

Immediacy behaviors help to communicate the presence of congruence, empathy, and positive regard. They are also indicative of being open to a personal relationship. Not all parties will be interested in individual relationships, but for instructors who are, engaging in communication immediacy behaviors helps learners perceive that interest. Immediacy behaviors include smiles, nods, eye contact, gestures, and vocal variety, all of which communicate caring, enthusiasm, and affirmation (Witt, Wheeless, & Allen, 2004). Actions such as giving praise, using humor, and sharing personal information are also considered expressions of immediacy (Gorham, 1988).

It is possible to demonstrate immediacy behaviors online, but it requires planning that is often not needed in face-to-face contexts. For instance, past research suggests that synchronous course components afford more immediacy than asynchronous communication alone (Pelowski, Frissell, Cabral, & Yu, 2005). Providing ways for same-time interactions to take place can communicate to learners that it is important to you to know them. There are a variety of technologies available that enable synchronous interaction; we will explore some of these during the media phase of redesign.

Other ways to demonstrate immediacy online include providing visual cues that help signal expressiveness, helping your learners get to know you (O'Sullivan, Hunt, & Lippert, 2004). For example, providing audio and/or video instructions and feedback can cultivate perceptions of immediacy, assuming they communicate warmth, genuineness, and encouragement (Cornelius-White, 2007). We can also communicate congruence, empathy, and positive regard through the words we choose. Gorham's (1988) verbal immediacy scale includes behaviors such as recognizing individual learners and encouraging their ideas and viewpoints, communicating your willingness to engage in one-to-one interactions, and appearing more genuine through humor and self-disclosure. With attention and planning, immediacy can be accomplished in online contexts.

Ultimately, the learner-instructor relationship will be co-created. Relationships are reciprocal in nature; students are not passive participants. Relational interactions between learners and instructors are not merely facilitated by the instructor.


The instructor crafts these interactions along with each student, constructing their relationship together. Learners interested in building a personal relationship with the instructor will likely respond to instructor immediacy with immediacy behaviors of their own.

A key redesign decision will be determining the relative importance of the learner-instructor relationship. It may be a key element to which you devote considerable effort and for which you seek enabling media. Or learner-instructor interactions may be subordinated to other types, privileging peer or content interactions instead. As always, redesign decisions should be made with the content, the learner, and the instructor in mind. Before we explore how these decisions might play out in our redesign, we need to revisit the idea of peer interactions and investigate the potential roles they play in the learning process.

Learner Interaction with Other Learners

Online learning can feel isolating, a likely contributor to increased attrition rates. In an exploratory study at a large Midwestern university in the United States, we set out to examine student expectations of online courses. Students were asked a series of questions about their expected experiences; the questions explored interactions with content, instructor, and peers. Of the 223 undergraduate students who responded, 37% indicated they would avoid taking classes online due to a lack of interaction. More than 36% of those responses specifically indicated concerns about not having access to peers (Parker & Child, 2009).

The learning community model has been shown to facilitate interest and action in online courses (Palloff & Pratt, 1999). Learning communities will develop only if peers interact authentically with each other and relationships begin to develop. Instructors can help to facilitate the process. Not every course needs heavy peer interaction, though. Learners may be cast primarily in the role of social support, information resources, collaborative partners, or simply co-attendees. Whichever you choose, you will want to develop a plan to facilitate the fulfillment of those roles. Look back at your chosen metaphor; it likely reflects your thoughts about the role peers will play within your course. There are considerations associated with each of the peer roles. For the purpose of discussion, the roles are organized into three nonexclusive categories that cast peers as community members, information resources, and/or collaborators.


Community Members

There are many good reasons to facilitate the building of a learning community within an online class. Constructivist theory posits that learning is a social process, done best in community (Garrison, 2013; Palloff & Pratt, 2005). In a study of 294 undergraduates enrolled in online classes, Leong (2011) found social presence, cognitive absorption, and interest to be predictors of student satisfaction with the online environment. Cognitive absorption refers to strong engagement in a learning activity. In several studies, satisfaction has been found to predict student perceptions of learning (see Eom & Wen, 2006; Swan, 2002).

Building a learning community can also help generate the desire to participate. As previously suggested, this may be particularly important in higher education, where students are required to take courses in subjects outside of their chosen major. Feeling a part of things can stimulate actions that may stimulate interest, all of which predict student perceptions of learning and overall satisfaction (Garrison, Anderson, & Archer, 2001; Swan et al., 2000). In his research on interaction within online courses, Picciano (2002) argues, "The term 'community' is related to presence and refers to a group of individuals who belong to a social unit such as students in a class" (p. 22). As previously discussed, instructor facilitation of social presence will be essential to building a sense of community in online courses. Instructor interactions, both administrative and facilitative, support decisions to cast peers as community members.

In the next section, we examine how peers may be cast in a formal role as information providers. This role may be added to that of community member, or may be assigned independently. However, the community member role, along with the learning community that accompanies it, may facilitate increased acceptance of peers as information resources.

Information Resources

Learners cast in the role of information resources provide at least part of the course content used by fellow learners. One value of the information resource role is that it forces learners to think more deeply about concepts and ideas in order to find or produce content that is useful to others (Anderson, 2003). Frank Oppenheimer, American physicist and university professor, seemed to know this intuitively when he uttered the often-quoted phrase, "the best way to learn is to teach."5 Students who come to know enough about a subject to provide information to others demonstrate two key components of learning: interest and effort. So, one thing to consider before casting students in this role is the likely level of their interest in the subject.


If the course is in students' major field of study, and they can see its value, they will be more likely to have the interest needed to motivate effort. Without that effort, students may provide information that is underdeveloped or incorrect, negating the intended benefits of the interaction.

A popular means for students to share information in an online environment is through discussion, usually in some asynchronous format. One way to facilitate meaningful participation is to require prework to guide students' efforts. Prework may be a reading or writing assignment that spurs their thinking. A short questionnaire or quiz might help evaluate readiness to participate. In this way, interaction with the instructor facilitates student information contribution. Students must see each other as credible sources, so prework and instructor validation help to encourage peer-to-peer information sharing.

There are good reasons to use peers as information resources. Past studies have shown that students perform better on concept knowledge questions following discussion, even if none of the peers knew the answers at the start of the discussion (Smith et al., 2009). To get the most out of content-related peer interactions, instructors will want to design activities to ensure that both information seeking and sharing occur strategically. For peers to act, and be accepted, as information resources, communication needs to be intentional, conscious, and goal-directed. This supports collective production of knowledge. There may be times when you need to move things beyond discussion to more deliberate action. In this case, peers can serve effectively as collaborators, collectively producing both knowledge and artifacts in support of course objectives.

Collaborators

Collaboration is more involved than the cooperative behavior of sharing information. Collaboration means actually working together, integrating individual efforts with others' to the point where it is no longer obvious which teammates did what work in support of learning goals (Parker & Ingram, 2011).6 Collaborative learning requires information sharing and knowledge generation, but it also requires task-oriented and social support behaviors. If peers are cast as collaborators, they will also need to be cast as information providers and community members. True collaboration won't just happen; instructors will need to facilitate and manage it, especially in an online course environment.

Teams perform best when they set shared goals, agree upon how they will work toward them, and have a system in place for holding peers accountable.


Instructors can assist in these processes by having teams create contracts, which articulate agreed-upon behaviors, and by enabling peer feedback through mechanisms such as Rate Your Mate (see Parker & Coykendall, 2012). Contracts facilitate expectation setting; team norms become explicit, reducing teammate uncertainty. Contracts set out agreed-upon processes, such as time schedules for responding to messages from teammates and where members should post new information.

There are a number of benefits to collaborative learning. It increases diversity of understanding (Swing & Peterson, 1982) and critical thinking (Webb, 1980), with learners working together to develop shared understandings so they can produce results. It also promotes social support behaviors and can make learning more enjoyable (Panitz, 1999). Personal experience in using online teams has yielded benefits for a variety of stakeholders. Learners built connections within the class that were maintained afterward, providing academic support, professional opportunities, and feelings of identity with the university.

Online learning teams have challenges associated with them, however. Higher attrition rates in online classes translate to an increased chance that teams will have to be reorganized due to loss of members, which is disruptive to everyone's learning process. Social loafing or freeloading behaviors tend to be exacerbated by the lack of proximity in virtual teams.7 It is much easier to forget or ignore team responsibilities on teams that never come together, especially if social presence is not established. Learners may also experience increased uncertainty due to lack of experience in working on a virtual team, increasing cognitive load. Choosing to cast peers as collaborators should be contingent upon the needs of your learners intersected with the course content and your instructional style.

All three types of learner-centered interactions (with content, with instructor, and with peers) will likely play a role in your online course; the prominence and character of the various interactions will influence, and be influenced by, previous redesign choices. We illustrate that process in the next section, framed by our course metaphor.

METAPHORICAL REFLECTIONS OF INTERACTION

The various considerations involved in redesigning courses for online delivery can seem overwhelming. When faced with too many choices, humans tend to make poor decisions, become dissatisfied with decisions, or become paralyzed and make no decisions at all (Iyengar & Lepper, 2000; Schwartz, 2004).


This is the function of metaphor; it narrows our choices. Used as a lens, it helps to filter out extraneous considerations and focus our attention on the choices most important to our course vision. Metaphor is a lens through which some choices are brought into clear focus, while others are obscured. This is the power of perspective, making the choice of metaphor your most critical redesign decision. Looking at your course through your chosen metaphor, the relevant design and interaction choices will become obvious.

In the following examples, you will see the same course approached through three different metaphors, each yielding a different set of decisions in terms of the objectives, methods, and interactions that structure the course. We will apply the metaphors to a higher education course in Organizational Communication and organize the examples around the elements of the first two redesign phases. After briefly presenting each metaphor, we will provide a list of decisions related to design and interaction in table format (see Tables 4.1-4.3). Within the text, we will highlight how these decisions connect to the metaphor. As we've said many times before, metaphor reflects views of how the content, instructor, and students fit together, each creating a unique course redesign. We begin by looking at the Organizational Communication course through the lens of a playground metaphor. Afterward, we will look at the same course through the metaphors of a baseball game and a symphony.8

Application of Playground Metaphor

The playground metaphor is well suited to survey-style courses, where students are exposed to unfamiliar information and introduced to new skills. We begin our application by deconstructing the metaphor to identify the component parts that will influence our design and interaction decisions. To aid us in this pursuit, Fig. 4.2 depicts the layout of a playground.

Brief Description of the Playground Metaphor

A playground features different sets of equipment on which children climb, swing, and slide. The pieces of equipment are all separate from one another. Children may run back and forth between the activities, but they cannot play on more than one piece of equipment at a time. Children on playgrounds love to experiment. Sometimes they play with other children, but mostly they play next to them, swinging on the swings or climbing the monkey bars, independently, to see how high they can go.


Fig. 4.2. Playground Metaphor.

Adults monitor the children at play, to keep them safe, but they intervene only when necessary. This suits most children, who prefer to play autonomously. Given that brief description, we use the playground metaphor to bring our redesign considerations into focus.

Fitting the Playground Metaphor with DeSIGN Decisions

We start the process by making design decisions, just as we did in the previous chapter. Learning objectives and instructional methods fit with learners who are "playing" with new concepts; exposure, rather than mastery, is implied by the metaphor. Course concepts function as the equipment on a playground, each in its own module, independent of the others. Learners are not expected to use more than one concept at a time (see Table 4.1).

Learning Objective. When we set the learning objective in Table 4.1, we limited expectations for learner performance based on the assumption that learners would be new to the material; we envisioned learner experience rather than mastery. We limited the scope of learners' choices to message content; all other decisions will be made for them.


Table 4.1. Course Redesign Using the Playground Metaphor.

DeSIGN stage

Objective: Learners will gain experience in communicating with customers, managers, and peers by creating messages designed to achieve an assigned purpose for a specific audience using various formats and correct grammar.

Share:
(1) Display various message formats to familiarize learners (business letters, memos, reports, email, brochures, newsletters)
(2) Describe the rationale and process of audience analysis to prepare them to match messages to the audience (demographics, size, status, perspectives on topic)
(3) Outline steps in the writing process to prepare them for creating messages (organization, research, editing, revision, proofreading)
(4) Review writing mechanics to help them use correct grammar (punctuation, capitalization, spelling, bias-free language)

Illustrate:
(1) Explain effective use of each message format
(2) Demonstrate audience analysis
(3) Compare and contrast effective and ineffective messages
(4) Show common writing errors

Guide:
(1) Assign practice in writing mechanics
(2) Structure tasks related to the writing process (audience analysis task; research task; outline message task; draft and revision task)
(3) Develop messages for submission

Nurture:
(1) Encourage self-assessment using rubrics
(2) Evaluate writing mechanics through an auto-feedback mechanism
(3) Respond with performance feedback on submissions

Interaction stage

Content: This type of interaction is most privileged under the playground metaphor. Learners move independently through materials, as they would on a playground. The course is laid out in holistic, self-contained modules that have varied content to hold the attention of newer learners. Careful construction of instructions, explanations, and work samples will be needed. Feedback mechanisms are essential to meeting objectives.

Instructor: This type of interaction is limited primarily to answering questions, providing evaluative feedback, and monitoring student activity. More specifically, interactions fall primarily into the administrative and facilitative categories and focus on learner-content interaction.

Learner: This type of interaction could be nonexistent. Consider casting peers as social support. If peers are used as information resources, limit the scope to activities such as proofreading.

Share Information and Illustrate Skills. In order to achieve the objective, we listed the information to share as part of the instructional methods. The learning modules are set up around the four sets of information identified, similar to the layout of the playground. To get learners ready to perform (creating messages), we identified skills to show our learners.


These are aligned with the four sets of information we plan to share. Therefore, the skills will be added to the relevant learning module.

Guide Practice and Nurture Progress. To build skills, we need to make decisions about how to get learners "doing." Practice activities to accomplish this end are listed in Table 4.1; learners need to practice the most challenging skills. Modules should be relatively balanced in terms of complexity and workload; consider the time and effort needed to perform each activity. We also listed decisions made to assess student learning and support further application. Given the autonomy implied by the metaphor, we encourage learner self-assessment using rubrics. The automated feedback fits with both the content and the learners. It is anticipated that the instructor will monitor these processes and provide feedback as needed.

Fitting the Playground Metaphor with Interaction Decisions

The playground metaphor suggests learners will primarily interact with content, just as children interact primarily with the equipment on a playground. This further suggests a course structure of separate, content-specific learning modules and privileges learner-content interactions. Table 4.1 describes how each of the three types of interaction might be used. Learner-instructor interaction will likely be primarily administrative (expectation setting) and facilitative (of learner interaction with content). Peer interaction is unlikely to play a role, except perhaps to cast peers as community members.

By viewing this course through the lens of the playground metaphor, we reduced the considerations to a manageable number. Since the metaphor strongly suggested a layout of independent, subject-specific modules, our attention was focused on considerations related to learner-content interaction. Next, we redesign this same course using the metaphor of a baseball game.

Application of Baseball Metaphor

The baseball metaphor differs from the playground metaphor. It fits naturally with skill development that requires drill and practice. It implies skill improvement to achieve some level of mastery (good, better, best), as we described in Chapter 2. Once again, we begin by identifying the component parts of the metaphor that will influence design and interaction decisions. Fig. 4.3 depicts the layout of a baseball diamond.


Fig. 4.3. Baseball Metaphor.

Brief Description of the Baseball Metaphor

In applying the baseball metaphor, we cast learners in the role of batters. Unlike the playground metaphor, which lends itself to discrete units of content, the baseball metaphor integrates skills and knowledge to complete a task (a hit). The more advanced their skills, the farther learners will progress. As we described when we previously explored the baseball metaphor, hits can advance the runner to first base (a single), second base (a double), third base (a triple), or around all four bases for a "home run." Learners will be more likely to get to first base early in the course, revising their work to improve and advance to other bases. Later in the course, learners' skills would be enhanced, enabling doubles, triples, or even home runs for which no revision would be needed.

Feedback is featured under this metaphor. In baseball it comes from the performance itself and also from a batting coach. This perspective lends itself to skill building and individualized learning. The best way for a learner to improve is at "batting practice" with someone coaching for improvement. While the learner is part of a team (the class), batting is an individualized skill. As in baseball, where a player's batting average (total hits per times at bat) and slugging percentage (total bases per times at bat) greatly influence earning potential, the learner's success in batting will influence course performance.9
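Both statistics are simple ratios, as the following sketch shows (the numbers are invented purely for illustration):

```python
def batting_average(hits: int, at_bats: int) -> float:
    """Total hits per times at bat."""
    return hits / at_bats

def slugging_percentage(total_bases: int, at_bats: int) -> float:
    """Total bases per times at bat."""
    return total_bases / at_bats

# Invented example: 50 hits and 80 total bases over 200 at-bats.
print(batting_average(50, 200))      # 0.25
print(slugging_percentage(80, 200))  # 0.40
```

An analogous course measure might weight polished first attempts above heavily revised ones, much as slugging percentage weights extra-base hits above singles, though whether to grade that way is left to the instructor.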


Given that brief description, we use the metaphor to once again bring redesign considerations into focus. This time the decisions will be different, beginning with the learning objective and continuing through the interaction choices (see Table 4.2).

Fitting the Baseball Metaphor with DeSIGN Decisions

Learning Objective. The objective, using the lens of a baseball game, differs from the one informed by the playground metaphor. This time, we set some expectations about performance level. Learners are to create effective messages that succeed in achieving the assigned purpose. Learners' approach will be strategic, implying they will be building on knowledge gained prior to the course.

Share Information and Illustrate Skills. The information to be shared is identified and listed in Table 4.2. We see indications of skills learners already possess, given the plans to review and reinforce writing skills. The information to be shared does not break out into subject-related modules as it did in the playground metaphor. Instead, the information is organized into parts of a process: defining purpose and audience, and applying the best format. The skills identified are more complex than those under the playground metaphor. Learners are to practice applying information to stages in the process of creating a strategic approach to preparing messages. Learning modules will be organized around these processes, reminiscent of running the bases.

Guide Practice and Nurture Progress. Practice activities speak to skill improvement. Learners attempt messages, revise, and fill skill gaps with practice materials. The more successful they are in practice, the more advanced their performance. Expectations of improvement indicate the need for coaching from the instructor; the methods for providing individualized feedback are listed. Auto-feedback has been dropped in favor of peer consultation, similar to that among players on a baseball team.

Fitting the Baseball Metaphor with Interaction Decisions

The baseball metaphor suggests learners will primarily interact with the instructor, but could also benefit from consulting with peers. The instructor plays the role of the batting coach and facilitates learner interaction with content. Peers, as members of a team, are cast as community members or possibly information resources.


Table 4.2. Course Redesign Using the Baseball Metaphor.

DeSIGN stage

Objective: Learners will prepare effective messages to customers, managers, and peers using the appropriate format, correct grammar, and a strategy that successfully achieves the assigned purpose with the identified audience.

Share:
(1) Explain the need for a clearly defined purpose and apt identification of the audience to achieve it
(2) Explain the best application for various message formats
(3) Describe message elements and strategies for organizing
(4) Review writing style
(5) Reinforce the need for good mechanics

Illustrate:
(1) Identify parts of a situational analysis, arriving at the identified purpose
(2) Outline key elements of audience identification and analysis
(3) Match formats to sample purposes/audiences
(4) Integrate purpose and audience to show how they collectively determine the content and tone of the message

Guide:
(1) Assign scenarios that state a purpose that should be met. Students are to identify the appropriate audience and construct messages to achieve the assigned ends
(2) Compel revisions until the message is deemed effective. Then learners move on to the next scenario: good, better, best model
(3) Guide learners to practice materials to fill any identified skill gaps: situation, audience, format, writing

Nurture:
(1) Assess messages and provide developmental and evaluative feedback
(2) Encourage learners to consult others for developmental feedback

Interaction stage

Content: This type of interaction is important for new information and practice materials designed to fill identified skill gaps. Integration of learning materials into process-oriented modules demonstrates how skills come together in the construction of effective messages. Learners work through the materials at their own pace, drawing from those materials most relevant to their assignments and instructor feedback.

Instructor: This type of interaction is privileged under the baseball metaphor, as the instructor provides coaching to allow learners to progress. The instructor primarily facilitates learner interaction with content. Relational communication strongly influences the satisfaction of both learner and instructor. Administrative interaction, in terms of expectation setting, also plays a role.

Learner: This type of interaction is likely limited to casting peers in the role of community members. The sports metaphor lends itself to seeing peers as social supporters (cheerleaders). Peers play a role in providing developmental feedback. The base running model lends itself well to identifying particular learners with the skills to help others.


Fitting the Baseball Metaphor with Interaction Decisions

The baseball metaphor suggests learners will primarily interact with the instructor, but they could also benefit from consulting with peers. The instructor plays the role of the batting coach and facilitates learner interaction with content. Peers, as members of a team, are cast as community members or possibly information resources. Content interaction is still strongly present, but learner application of information is most informed by interaction with the instructor. By viewing this course through the lens of a baseball game, redesign decisions become focused on providing content that allows learners to practice, as well as feedback to help them advance. Learners are seen as part of a community (team), but performance is still an individual endeavor. Interaction with the instructor is privileged above learner interaction with content or peers. Next, we redesign this same course one last time using the metaphor of the symphony.

Application of the Symphony Metaphor

The symphony metaphor departs from both the playground and baseball metaphors, which framed learning as an individual process. The symphony captures the essence of constructivist learning theories, which argue that knowledge is not a matter of learning objective truths but something socially constructed. Learners actively create contextualized meanings, usually in community with other learners, tying past experience to new information. It is an interpretive process enhanced through discourse (Delia, 1977; Huang, 2002). So it is with the symphony, where musical scores are interpreted and performed collaboratively, with each musician and section playing their part with a shared vision of what the music should ultimately sound like. The process moves beyond the mechanics of playing notes to enacting a musical style that brings that vision to life. In an interview with the Los Angeles Times, distinguished American composer Lewis Spratlan described the symphonic process this way: "By the time you get to the actual concert, you've worked out pretty much what you want to do. It's really a matter of getting 100 musicians to think like one person" (Schultz, 2010). Fig. 4.4 depicts the symphony.

Brief Description of the Symphony Metaphor

In viewing learners as members of a symphony, we view them as having the building blocks of performance already in place. For instance, members of an orchestra all read music using the same basic rules, an ability they bring with them to rehearsals. Learners would similarly be expected to bring certain skills with them to the learning experience.


Fig. 4.4. Symphony Metaphor.

Musicians also bring their own unique skills and perspectives to the experience, based upon the instruments they play and any additional roles they may be assigned (e.g., section leader). Similarly, learners will be unique in terms of skills and knowledge based upon academic major and other experiences. In looking at Fig. 4.4, the musicians are separated into sections by instrument type. The implication for course redesign is that learners will be similarly grouped. Just as sections would be expected to meet for practice sessions outside of full orchestra rehearsals, learners will meet in groups. Individual learners will be expected to do some work on their own, just as musicians must put in practice hours alone. Learning, as in the performance of a symphony, will be the result of individual and collective efforts integrated into a collaborative creation based upon a shared vision. With that description in mind, let's use the metaphor to bring course redesign decisions into focus (see Table 4.3).

Table 4.3. Course Redesign Using the Symphony Metaphor.

DeSIGN stage

Objective: Learners will effectively respond to situations involving customers, managers, and peers, identifying the purpose and audience to be addressed using a format, writing style, and message strategy that would successfully resolve the situation.

Share:
(1) Describe the importance of identifying purpose and audience and researching both
(2) Review various formats for messages and how they are best applied
(3) Outline steps of message construction, providing clear connections back to purpose

Illustrate:
(1) Show sample messages, highlighting how purpose and audience align with the information provided and the message tone
(2) Contrast poorly constructed messages, demonstrating where the errors in approach are most apparent
(3) Examine classic examples of message outcomes, highlighting positive and negative consequences

Guide:
(1) Reinforce the need for appropriate writing style and sound mechanics through assessment and practice
(2) Distribute scenarios for learners to collaboratively analyze in order to determine the purpose, audience, and best approach to resolve the situation. Written responses could be authored by individuals or by teams
(3) Organize the message construction process by setting dates for staged reviews

Nurture:
(1) Facilitate class analysis by inviting discussion about other teams' responses to situations
(2) Direct discussion of potential consequences for each of the class responses
(3) Encourage peer and self-assessment to identify performance gaps

Interaction stage

Content: This type of interaction is reduced under the symphony metaphor. Much of the learning takes place in interaction with others. Facilitating discussions around processes becomes more essential to the learning process. Content serves primarily as reference material. Some individual activities to practice skills are included.

Instructor: This type of interaction is primarily facilitative, supporting and guiding peer interactions and activities. Some administrative interaction may be necessary to keep learners on the same page and schedule and in setting expectations. Facilitation of class discussion is essential to class-level learning.

Learner: This type of interaction is privileged over the others. Peers are cast as collaborators, information resources, and community members. Social presence is needed to facilitate achievement of the learning objectives by teams and the class.


Fitting the Symphony Metaphor with DeSIGN Decisions

Learning Objective. The learning objective changes again with application of the symphony metaphor. This lens suggests learners come to the experience with a foundational set of skills that can be combined with those of other learners to create effective messages. The new objective sets the expectation that learners will develop effective responses to various unstructured situations. Learners will need to interpret the elements of each situation to create a vision for how it should be handled, much as musicians do in a symphony. The instructor plays the role of conductor, facilitating the integration of effort so learners collectively create effective messages.

Share Information, Illustrate Skills, Guide Practice, and Nurture Progress. Once again, we planned the instructional methods, but this time we see increased integration of skills. Information sharing focuses on making connections between interpretive processes, such as researching elements of the situation, and then integrating them in message construction. The basic course layout is organized around those activities. Skills and practice decisions align with the activities. It is in the support of future learning that we see team outcomes serving as input to learning in the larger class, as the symphony metaphor implies. Class analysis and discussion are used to estimate the success of each team's strategy, working out the class's collective vision of what it means to "respond effectively."

Fitting the Symphony Metaphor with Interaction Decisions

The symphony metaphor suggests that peer interaction is privileged: learners are grouped into learning teams that construct messages together. Learner interaction with content is the equivalent of individual practice in preparation for working with teammates. Peers are cast as collaborators as well as information resources and community members. Class discussion becomes the place where the knowledge constructed by each team is compared with that of the others, deepening overall learning. Instructor interaction is primarily facilitative of learner-learner interaction.

Illustrating the first two phases of course redesign using three distinct metaphors highlights how powerful a metaphor can be in narrowing the scope of our considerations as we make course redesign decisions. Metaphors are not simply names (playground, baseball game, symphony); they are the lens that brings focus to our views about the fit between our content, our learners, and our instructional style. In so doing, they structure our thinking, translating our vision into individual decisions that collectively create an elegant course design.


CONCLUSION

There is much talk in the blogosphere about online education being inferior to the traditional classroom, despite empirical support for its equality, and even superiority, in enabling student learning. Interaction decisions, more than any other set of considerations, will determine how you and your learners experience your course. While interactions should be student-centered, they need to be instructor-driven. Instructors need to facilitate learner interactions with content and with other learners. Redesign choices related to layout and course structure, the nature of the instructor-learner relationship, and whether peers will be community members, information resources, or collaborators should all support the achievement of learning objectives in light of the target audience (your learners). The number of considerations related to course redesign can seem overwhelming. Leverage the power of metaphor to focus your attention on those considerations that are most relevant. Once you create a basic vision of your course, you are ready for the third phase of course redesign. The purpose of technology is to enable your vision. In the next chapter, we revisit previous decisions to identify technology needs and investigate a process by which to find media to fill them.

NOTES

1. Synchronous interaction occurs when those involved are acting together in real time. Asynchronous interaction occurs when those involved are acting at independent intervals rather than in real time.
2. Social presence is the perception of psychological closeness.
3. Office hours are commonly used by professors in the United States. They are dedicated hours for student meetings during which students do not need an appointment to meet with the instructor.
4. Teaching presence is one of three presences included in the community of inquiry model; the other two are social presence and cognitive presence. Teaching presence is defined as "the design, facilitation, and direction of cognitive and social processes for the purpose of realizing personally meaningful, and educationally worthwhile, learning outcomes" (Anderson, Rourke, Garrison, & Archer, 2001, p. 5).
5. The quote has been re-quoted enough times that its origins are not verifiable.
6. Much of this thinking related to collaboration came out of work done as part of the now disbanded Collaborative Technologies Learning Community at Kent State University, which was funded by the Ohio Learning Network from 2001 to 2004. We drew from the small group communication and collaborative learning literature in developing a series of applied projects.
7. Social loafing and freeloading are interchangeable terms that refer to the tendency for some teammates to put forth less effort on a team than they would if they were solely responsible for the work.
8. Adaptation of the metaphor was inspired by Woolliscroft and Phillips (2003).
9. In the case of professional players associated with Major League Baseball in the United States.

REFERENCES

Allen, M., Witt, P. L., & Wheeless, L. R. (2006). The role of teacher immediacy as a motivational factor in student learning: Using meta-analysis to test a causal model. Communication Education, 55(1), 21–31.
Anderson, T. (2003). Modes of interaction in distance education: Recent developments and research questions. In M. G. Moore & W. G. Anderson (Eds.), Handbook of distance education (pp. 129–145). Mahwah, NJ: Lawrence Erlbaum.
Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conference environment. Journal of Asynchronous Learning Networks, 5(2), 1–17.
Aragon, S. R. (2003). Creating social presence in online environments. New Directions for Adult and Continuing Education, 100, 57–68.
Arend, B. (2009). Encouraging critical thinking through online threaded discussions. The Journal of Educators Online, 6(1), 1–23.
Burleson, B. R., Samter, W., & Lucchetti, A. E. (1992). Similarity in communication values as a predictor of friendship choices: Study of friends and best friends. Southern Communication Journal, 57(4), 260–276.
Collison, G., Elbaum, B., Haavind, S., & Tinker, R. (2000). Facilitating online learning: Effective strategies for moderators. Madison, WI: Atwood.
Cornelius-White, J. (2007). Learner-centered teacher-student relationships are effective: A meta-analysis. Review of Educational Research, 77(1), 113–143.
Delia, J. G. (1977). Constructivism and the study of human communication. Quarterly Journal of Speech, 63(1), 66–83.
Doyle, T. (2011). Learner-centered teaching: Putting the research into practice. Sterling, VA: Stylus.
Eom, S. B., & Wen, H. J. (2006). The determinants of students' perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education, 4(2), 215–235.
Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2–3), 87–105.
Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. The American Journal of Distance Education, 15(1), 7–23.


Garrison, D. R. (2013). Theoretical foundations and epistemological insights of the community of inquiry. In Z. Akyol & D. R. Garrison (Eds.), Educational communities of inquiry: Theoretical framework, research, and practice (pp. 1–11). Hershey, PA: IGI Global.
Gorham, J. (1988). The relationship between verbal teacher immediacy behaviors and student learning. Communication Education, 37, 40–53.
Hawisher, G. E., & Pemberton, M. A. (1997). Writing across the curriculum encounters asynchronous learning networks or WAC meets up with ALN. Journal of Asynchronous Learning Networks, 1(1), 52–72.
Hosler, K. A., & Arend, B. D. (2013). Strategies and principles to develop cognitive presence in online discussions. In Z. Akyol & D. R. Garrison (Eds.), Educational communities of inquiry: Theoretical framework, research, and practice (pp. 148–167). Hershey, PA: IGI Global.
Huang, H. (2002). Toward constructivism for adult learners in online learning environments. British Journal of Educational Technology, 33(1), 27–37.
Iyengar, S., & Lepper, M. (2000). When choice is demotivating: Can one desire too much of a good thing? Journal of Personality and Social Psychology, 79(6), 995–1006.
Janicki, T., & Liegle, J. O. (2001). Development and evaluation of a framework for creating web-based learning modules: A pedagogical and systems perspective. Journal of Asynchronous Learning Networks, 5(1), 58–84.
Leach, L., & Zepke, N. (2010). Engaging students in learning: A review of a conceptual organiser. Higher Education Research & Development, 30(2), 193–204.
Leong, P. (2011). Role of social presence and cognitive absorption in online learning environments. Distance Education, 32(1), 5–28.
Moore, M. G. (1989). Three types of interaction. In M. G. Moore & G. C. Clark (Eds.), Readings in principles of distance education (pp. 100–105). University Park, PA: American Center for the Study of Distance Education.
Nicholson, S. (2002). Socialization in the virtual hallway: Instant messaging in the asynchronous web-based distance education classroom. The Internet and Higher Education, 5(4), 363–372.
O'Sullivan, P. B., Hunt, S. K., & Lippert, L. R. (2004). Mediated immediacy: A language of affiliation in a technological age. Journal of Language and Social Psychology, 23(4), 464–490.
Palloff, R. M., & Pratt, K. (1999). Building learning communities in cyberspace. San Francisco, CA: Jossey-Bass.
Palloff, R. M., & Pratt, K. (2005). Collaborating online: Learning together in community. San Francisco, CA: Jossey-Bass.
Panitz, T. (1999). Benefits of cooperative learning in relation to student motivation. In M. Theall (Ed.), Motivation from within: Approaches for encouraging faculty and students to excel (New Directions for Teaching and Learning, No. 78, pp. 59–67). San Francisco, CA: Jossey-Bass.
Parker, R. E. (2003). Distinguishing qualities of virtual groups: An issue-oriented perspective. In R. Y. Hirokawa, R. Cathcart, L. Samovar, & L. Henman (Eds.), Small group communication: Theory and research (pp. 31–37). Los Angeles, CA: Roxbury.
Parker, R. E. (2005, January). Design made simple: Leadership training and development for managers. Workshop and materials developed for Leadership Portage County, Rootstown, OH.


Parker, R. E., Bianchi, A., & Cheah, T. (2008). Exploring student and faculty perceptions of technology in education. Educational Technology & Society, 11(2), 274–293.
Parker, R. E., & Child, J. (2009). Study of student expectations of online courses. Unpublished raw data.
Parker, R. E., & Coykendall, S. (2012, November). The rate your mate process: Facilitating collaboration through goal setting, shared expectations, and peer accountability. Paper presented at the annual meeting of the National Communication Association, Orlando, FL.
Parker, R. E., & Ingram, A. (2011). Considerations in choosing online collaboration systems: Functions, uses, and effects. Research Journal of the Center for Educational Technology, 7(1), 2–15.
Pelowski, S., Frissell, L., Cabral, K., & Yu, T. (2005). So far but yet so close: Student chat room immediacy, learning, and performance in an online course. Journal of Interactive Learning Research, 16, 395–407.
Picciano, A. (1998). Developing an asynchronous course model at a large, urban university. Journal of Asynchronous Learning Networks, 2(1), 1–14.
Picciano, A. (2002). Beyond student perceptions: Issues of interaction, presence, and performance in an online course. Journal of Asynchronous Learning Networks, 6(1), 21–40.
Riva, G., & Galimberti, C. (1998). Computer-mediated communication: Identity and social interaction in an electronic environment. Genetic, Social, and General Psychology Monographs, 124, 434–464.
Rogers, C. (1962). The interpersonal relationship: The core of guidance. Harvard Educational Review, 32(4), 416–429.
Ruberg, L. F., Moore, D. M., & Taylor, C. D. (1996). Student participation, interaction, and regulation in a computer-mediated communication environment: A qualitative study. Journal of Educational Computing Research, 14, 243–268.
Schultz, R. (2010, August 15). Are conductors really necessary? Los Angeles Times. Retrieved from http://articles.latimes.com/2010/aug/15/entertainment/la-ca-what-conductors-do20100815. Accessed on June 15, 2013.
Schwartz, B. (2004). The paradox of choice: Why more is less. New York, NY: HarperPerennial.
Sheridan, K., Kelly, M. A., & Bentz, D. T. (2013). A follow-up study of the indicators of teaching presence critical to students in online courses. In Z. Akyol & D. R. Garrison (Eds.), Educational communities of inquiry: Theoretical framework, research, and practice (pp. 67–83). Hershey, PA: IGI Global.
Short, J., Williams, E., & Christie, B. (1976). The social psychology of telecommunications. Toronto, Canada: Wiley.
Smith, M., Wood, W., Adams, W., Wieman, C., Knight, J., Guild, N., & Su, T. (2009). Why peer discussion improves student performance on in-class concept questions. Science, 323(5910), 122–124.
Stacey, E. (2002). Social presence online: Networking learners at a distance. Education and Information Technologies, 7(4), 287–294.
Sunnafrank, M., & Andersen, J. (1991). Interpersonal attraction and attitude similarity: A communication-based assessment. In J. A. Andersen (Ed.), Communication yearbook (Vol. 14, pp. 451–483). Newbury Park, CA: Sage.


Swan, K. (2002). Building learning communities in online courses: The importance of interaction. Education, Communication & Information, 2(1), 23–49.
Swan, K., Shea, P. L., Fredericksen, E., Pickett, A., Pelz, W., & Maher, G. (2000). Building knowledge building communities: Consistency, contact and communication in the virtual classroom. Journal of Educational Computing Research, 23(4), 389–413.
Swan, K., & Shih, L. F. (2005). The nature and development of social presence in online discussions. Journal of Asynchronous Learning Networks, 9(3), 115–136.
Swing, S. R., & Peterson, P. L. (1982). The relationship of student ability and small group interaction to student achievement. American Educational Research Journal, 19(2), 259–274.
Vrasidas, C., & McIsaac, M. S. (2000). Principles of pedagogy and evaluation for web-based learning. Educational Media International, 37(2), 105–111.
Webb, N. M. (1980). An analysis of group interaction and mathematical errors in heterogeneous ability groups. British Journal of Educational Psychology, 50(3), 266–276.
Weiss, R. E. (2000). Humanizing the online classroom. In R. E. Weiss, D. S. Knowlton, & B. W. Speck (Eds.), Principles of effective teaching in the online classroom: New directions for teaching and learning (Vol. 84, pp. 47–51). San Francisco, CA: Jossey-Bass.
Witt, P. L., Wheeless, L. R., & Allen, M. (2004). A meta-analytical review of the relationship between teacher immediacy and student learning. Communication Monographs, 71(2), 184–207.
Woolliscroft, J. O., & Phillips, R. (2003). Medicine as performing art: A worthy metaphor. Medical Education, 37, 934–939.

CHAPTER 5 MEDIA CONSIDERATIONS

ABSTRACT

Media considerations are pedagogical rather than technological in nature. In online courses, we use technology to enable learner interaction. In this chapter, we focus on a process through which we identify media that will help bring our course to life. Technology tools come and go quickly. While some specific tools are suggested, it is the process by which to identify and select media that is enduring. We begin with a discussion of the media-enabled course activities that are used to guide the selection process. The 10 activities are organized by the type of interaction they represent and the media characteristics they require. Media have affordances, or functions, that can be matched with identified course activities to meet learner interaction needs. These needs help to narrow the scope of our selection decisions. After exploring a variety of functions and tools, we exemplify the media selection process. We extend the work started in previous chapters by identifying media needs in light of design and interaction decisions under the playground and symphony metaphors. In so doing, we demonstrate how the phases of the redesign process inform our technology choices.

Keywords: Course redesign; instructional technology; online teaching and learning; educational technology; media selection; online education

INTRODUCTION

With the most important considerations made, we are ready to examine the role of media in redesigning courses for online delivery. As we've previously discussed, technology should support, rather than dictate, redesign choices.


It is tempting to let the tools drive the process instead of pedagogy. Universities invest in software; instructors try to make use of it without fully considering its purpose or effects. According to a survey of 4,500 undergraduates at 13 colleges conducted by Educause, students recognize and object to instructors employing technology unreflectively. Many complained about requirements to use chat rooms and discussion boards without instructor moderation and guidance; others complained of instructors who "devoted too much time to teaching students some quirky Web tool at the expense of delivering course material" (Young, 2004, p. A-31). Perhaps of greater concern is when technology exerts inappropriate influence on content choices. This can occur with or without instructor awareness. For instance, Clifford Nass, professor of communication and director of the Communication between Humans and Interactive Media (CHIMe) Lab, admitted during an interview with The New Yorker that he once removed a book from his syllabus because he couldn't figure out how to "PowerPoint" it (Parker, 2001). Instructors new to a university may simply teach the syllabus of previous instructors, unaware of how content decisions were made. Pressure from administrators and students to use technology also contributes to unreflective use. Instructors digitize lectures and/or class discussion without considering alternative approaches or anticipating consequences (Kinchin, 2012; Parker & Ingram, 2011).

The Dime model starts the course redesign process with basic instructional design decisions, such as course objectives and teaching methods, then examines the role of interaction, with content, instructor, and peers, before considering technology. Now that we've considered design and interaction, we're ready to consider media. The Dime model uses the term media to refer to technology considerations because media represent the fundamental purpose of technology in online courses, which is to support or facilitate learner interaction. Media are the channels through which information is communicated for the purpose of developing shared understanding. According to media richness theory, media fall on a continuum from lean to rich (Daft & Lengel, 1986). Richer channels, like face-to-face communication or the telephone, are more immediate, so they carry more information (i.e., through feedback or nonverbal cues) than leaner channels, such as written documents. Ideally, we choose the media best suited to the needs of our task (Daft, Lengel, & Trevino, 1987). In order to make the best media choices in redesigning our course, we must crystallize and articulate the vision we have for our course, which is implicit in our chosen metaphor and developed through our design and interaction decisions.


In this chapter, we focus on a process through which we identify the media that will help bring our course to life. In 1964, Marshall McLuhan wrote Understanding Media, in which he coined the famous phrase "the medium is the message." Essentially he argued that the medium is the environment and that, more than any individual factor, it becomes the dominant influence in the creation of meaning (McLuhan, 1994). McLuhan was looking at things from a societal standpoint, but the media chosen during course redesign have the power to restructure course content and relationships. Technology has been shown to have a restructuring effect in organizations by altering who communicates with whom, how often, and by what means (Barley, 1990). By recognizing the potential consequences of technology, instructors can better harness its power to meet the needs of their content, their learners, and their instructional style. Many of the specific tools discussed later in the chapter will be outdated as soon as this volume is published. The main takeaway from this chapter isn't the tools; it is the process by which to identify and select them. The goal is to provide a framework for making reflective choices.

IDENTIFYING MEDIA NEEDS

Educational technologies are often organized around features. Common features include document sharing, blogs, polling, and instant messaging. Organizing tools in this way highlights their affordances, but it may lead to unreflective choices. A better way to organize media is around the teaching and learning activities they enable. As we discussed in Chapter 3, online education lends itself to a "learn by doing" process. Borrowing from the management literature, Sharp and McDermott (2001) define a process as "a collection of interrelated activities initiated in response to a triggering event, which achieves a specific, discrete result for the stakeholders of the process" (p. 58). We view media selection in course redesign similarly. Media are needed to enable the specific, interaction-related activities that comprise online education. The teaching and learning activities media will enable are implicit in the choices made in previous phases of redesign. Later in the chapter, we will revisit those decisions and explicitly identify the activities media choices should enable within our particular course. First, let's explore the activities that form the basis of the course media selection taxonomy.


Media Enabled Course Activities

The media selection taxonomy begins with 10 interaction-related activities. These activities are carried out through technology and will guide media selection, covered in a later section. First, we need to define the activities and discuss a means for recognizing their role as revealed in choices made during the design phase. Courses vary in the activities they include; it is unlikely a course will involve all 10. The activities are summarized in Table 5.1.

Table 5.1. Media Enabled Course Activities.

Activity | Description | Related Action Verbs

Content-related
Informing | Providing new information | Describe; Review; Outline
Explaining | Clarifying information | Explain; Examine; Compare/contrast
Showing | Demonstrating examples and processes | Display; Demonstrate; Structure
Engaging | Integrating interactivity to get learners "doing" | Assign; Develop; Reinforce

Instructor-related
Coordinating | Organizing the learner experience | Organize; Manage; Match
Conferencing | Enabling simultaneous interaction with instructor and/or peers | Meet; Talk; Direct
Responding | Providing feedback | Evaluate; Respond; Encourage

Peer-related
Networking | Facilitating connections between peers | Facilitate; Introduce
Collaborating | Co-producing outcomes with one or more peers | Collaborate; Distribute
Simulating | Immersing learners in authentic situations | Immerse; Captivate


We begin by investigating those activities most related to the delivery of content.

Informing
Presentation of information is a primary activity in teaching and learning. The activity of informing is grounded in a cybernetic concept of information, meaning that information consists of facts and the communication of facts (Wiener, 1954). In teaching and learning, information is knowledge communicated. The communication of course content is the primary action associated with informing.

Explaining
Clarification of information is a complement to informing that is directed toward reducing uncertainty and ambiguity and generating deeper understanding of course content. Uncertainty refers to perceived vagueness in content, whereas ambiguity refers to potentially conflicting interpretations (Schrader, Riggs, & Smith, 1993). Explaining is the activity that aids learners in thinking about and using information.

Showing
Demonstration supports learners in skill development. Showing learners how to apply course concepts and processes helps move learning beyond abstract understanding to the higher levels of learning associated with the cognitive domains of application, analysis, and creation, originally set out by Bloom (1956) in his taxonomy of learning objectives and later revised by Anderson, Krathwohl, and Bloom (2001). Active learning, through which students see, hear, and do, depends upon showing learners the way (Silberman, 1996).

Engaging
Making content interactive is the key to engagement in online courses. According to Merriam-Webster Online, interactivity involves mutual or reciprocal action, especially "the actions or input of a user" in "a two-way electronic communication system" utilizing user commands or responses. Engagement includes learner experimentation with course concepts, during which learners provide input and reactions to the process and exercise a measure of independence and control (Sims, 2003). Engaging learners in discussions, games, or problem-based learning precludes learner passivity, fostering deeper-level learning (Douglas, 2012).


Next we look at activities most related to learner-instructor interaction.

Coordinating
The absence of fixed meeting times sets up the need for more explicit management of learners and learning activities. This activity falls mostly to the instructor and consists of actions such as communicating course expectations, setting assignment schedules, arranging course materials, and organizing learner interactions. As learner autonomy and control increase, so may the need for learners to coordinate their own interactions with content, instructor, and/or peers.

Conferencing
There may be times in an online course when learners need to come together virtually with the instructor and/or peers. Requiring synchronous meetings may violate some learners' expectations (as well as those of administrators who market online courses as learning anytime), but there is support in the literature for better and more satisfying learning when at least some sessions are synchronous (Grant & Cheon, 2007; Little, Passmore, & Schullo, 2006; McBrien & Jones, 2009). In my own recent experience using conferencing in an online course, social presence increased and a learning community became visible. Synchronous sessions are rich, but they introduce a level of complexity that may or may not be warranted. As with all choices, the key is to match the approach to the content, learners, and instructor.

Responding
Provision of feedback comprises the activity of responding. Feedback is a primary means by which learning occurs. Chickering and Gamson (1987) identify the provision of prompt feedback as one of the seven principles of good practice in undergraduate education. Feedback may be developmental or evaluative in nature and could come from a variety of human or computer sources.

The final set of activities is most closely associated with peer interaction.

Networking
Whether peers will be cast as information resources, collaborators, or social supporters, they will need to make connections and build relationships. Perceptions of learning and satisfaction are enhanced by the sense of belonging that comes with membership in a learning community (Alavi & Dufner, 2005).


For instance, a study of 314 online learners enrolled in 26 graduate courses found a significant relationship between the strength of community and perceived cognitive learning (Rovai, 2002). Learners will need to make peer connections through the activity of networking.

Collaborating
Casting peers as collaborators sets the expectation that learners will engage in the co-production of outcomes. Collaborating learners participate actively in all aspects of a project; they don't just divide up the tasks for individuals to perform and assemble them into some unit of production at the end. Collaborating online involves the integration of learners' efforts through interactions that are mediated (Parker & Ingram, 2011).

Simulating
Immersion of learners in authentic situations allows for the simulation of actions and consequences. Simulations increase learner engagement, facilitating deeper learning through complex applications of content in authentic tasks (Driscoll & Carliner, 2005). Assuming learners are interested in knowing how to perform these tasks, motivation for learning will strengthen, reducing attrition and increasing rates of successful completion (McKeachie, 2002). Simulations may also allow for adaptive learning, adjusting aspects of the experience to meet individual learner needs (Kirkley & Kirkley, 2004). Some research indicates boredom on the part of students when simulations are highly text based or poorly designed (Smart & Cappel, 2006). They may also involve substantial development effort and a steep learning curve for users. Learners will likely engage with content and other learners while simulating.

For purposes of examining media, it helps to think of the teaching and learning activities in terms of the three types of learner-centered interactions discussed in Chapter 4. In the next section, we connect learner-content, learner-instructor, and learner-learner interactions to the characteristics of the media that best enable them.

Activities by Interaction Type and Media Characteristics

Each of the 10 teaching and learning activities fits easily with one of the three types of interaction involved in online learning. These interactions are mediated through technology (media), and activities that fall within an interaction type require similar media characteristics to enact them. When it comes to selecting specific technology tools, there are countless possibilities.


Organizing the activities by interaction type brings into focus what you need the tool to do, effectively narrowing your options, which improves decision-making (Scheibehenne, Greifeneder, & Todd, 2010). Later, we'll explore specific tools based upon this organizing scheme. Table 5.2 summarizes the activities by interaction type and media characteristics.

Content-Related Activities

The activities of informing, explaining, showing, and engaging primarily involve learner interaction with content. Instructors or peers may facilitate the activities (e.g., the instructor provides directions or develops a presentation), but it is the interaction of the learner with the content, regardless of the source, that is the primary focus. The content-related activities of informing, explaining, and showing are largely one-way in that information flows in one direction, from the content to the learner. They tend to be planned and prepared in advance for learners' consumption. They are rarely interactive; when they do afford learner input of some sort, the activity becomes engaging. Engaging is the content-related activity that demands the learner take action in order to continue the activity. Learner actions, such as posting a message, matching words, or playing a game, make it a two-way interaction. Interactions may be synchronous (although computer automated) or asynchronous.

Table 5.2. Course Activities by Interaction Type and Media Characteristics.

Activity | 1-Way | 2-Way Sync | 2-Way A-Sync

Content-related
Informing | X | |
Explaining | X | |
Showing | X | |
Engaging | | X (auto) | X

Instructor-related
Coordinating | | | X
Conferencing | | X |
Responding | | | X

Peer-related
Networking | | X | X
Collaborating | | X | X
Simulating | | X | X
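To make the narrowing step concrete, here is a minimal sketch in Python of the lookup-and-filter logic the table implies. The activity map encodes the table as reconstructed above, and the tool catalog is a toy example invented for illustration, not from the text:

    # Hypothetical encoding of Table 5.2: each activity maps to its
    # interaction type and the media characteristics it can involve.
    ACTIVITY_NEEDS = {
        "informing":     ("content",    {"1-way"}),
        "explaining":    ("content",    {"1-way"}),
        "showing":       ("content",    {"1-way"}),
        "engaging":      ("content",    {"2-way sync", "2-way async"}),
        "coordinating":  ("instructor", {"2-way async"}),
        "conferencing":  ("instructor", {"2-way sync"}),
        "responding":    ("instructor", {"2-way async"}),
        "networking":    ("peer",       {"2-way sync", "2-way async"}),
        "collaborating": ("peer",       {"2-way sync", "2-way async"}),
        "simulating":    ("peer",       {"2-way sync", "2-way async"}),
    }

    def candidate_tools(activity, catalog):
        """List tools affording at least one characteristic the activity involves."""
        _, needed = ACTIVITY_NEEDS[activity]
        return [tool for tool, traits in catalog.items() if needed & traits]

    # Toy catalog of tools and the characteristics they afford.
    catalog = {
        "discussion board": {"2-way async"},
        "web conference":   {"2-way sync", "2-way async"},
        "narrated slides":  {"1-way"},
    }
    print(candidate_tools("conferencing", catalog))  # -> ['web conference']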


Instructor-Related Activities

Coordinating, conferencing, and responding are considered instructor-related activities because they directly connect the instructor and learner with one another, either synchronously or asynchronously. To be considered synchronous, interactions must involve simultaneous participation by another person, in this case the instructor or a learner. Asynchronous interactions may sometimes feel synchronous (e.g., a learner sends an email message that the instructor receives and responds to immediately), but in light of media characteristics, they would still be considered asynchronous. The nature of the tool is transmission (send, receive, send, receive) rather than reciprocal (chat in real time). In looking at instructor-related activities, we could drill down to make more specific connections with the categories of instructor interaction discussed in Chapter 4. For instance, administrative interaction connects well with the activity of coordinating. Facilitative interaction fits naturally with responding, as does relational interaction with conferencing. These connections are not mutually exclusive. For instance, the instructor's approach to responding, as well as the learner's reaction, will influence, and be influenced by, their relationship. For the purposes of media selection, mapping the activities can be useful.

Peer-Related Interactions

Networking, collaborating, and simulating are activities that directly connect peers with one another in either synchronous or asynchronous interactions. The interrelated nature of these activities will require rich media affordances to enable them. The complex nature of the interactions will likely require some complementary instructor interaction, such as coordinating. There may also be a steeper learning curve for students when using media to enable collaborative peer interactions. Next, we turn to the types of media that will enable course activities. Activities organized by interaction type and primary media function are summarized in Tables 5.3 and 5.4, along with sample tools. Tools are exemplary rather than prescriptive. A full discussion connecting interaction, activities, and media functions follows.

FINDING MEDIA TO MEET NEEDS

Functions are broad categories of program features that facilitate interaction processes (Parker & Ingram, 2011).


Functions have intended purposes, but they are sometimes adapted by users and used in other ways. Richer media lend themselves to more unexpected or ironic uses. Instructors will want to consider and monitor actual learner behavior, as these uses could have a restructuring effect (Scott, Quinn, Timmerman, & Garrett, 1998). We organize the discussion of media needs around the interaction-related activities to be supported, beginning with those that enable learner interaction with content.

Media for Content-Related Activities

The activities related to learner-content interaction are informing, explaining, showing, and engaging. Table 5.3 intersects the four content-interaction activities with their primary media function and provides a list of sample tools, some of which are explored further in the discussion that follows.

Table 5.3. Sample Tools to Enable Content-Interaction Activities.

Activity | Primary Media Function | Sample Tools
Informing | Information presentation | E-text; Slideshare; Prezi; Haiku Deck; Storify; YouTube; Open Educational Resources (OERs)
Explaining | Talking (audio) | Audacity; Pod-O-Matic; Odiogo; Chirbit; Voicethread
Showing | Screencasting | Camtasia; Jing; Explain Everything; Panopto; Screencast
Engaging | Interactivity | Hot Potatoes; Quizlet; QuestGarden; Glomaker; Softchalk; Merlot; OERs

Informing

Information presentation tools support the activity of informing. E-texts and documents are some of the simplest and most common ways to present information. Whether learners will actually read the material is a consideration. Discussions on this topic are happening on campus and online, and conversations focus on how to get students to read. For a sampling of these conversations, visit The Teaching Professor blog from Faculty Focus.1 When asked, students in my own classes report, "you have to make us." Research on the scholarship of teaching and learning appears to echo student sentiment, with studies on strategies to increase student compliance (e.g., see Hoeft, 2012).


In addition to assigned readings, "slideware" such as PowerPoint or Keynote is also popular for informing. Slide decks are frequently hosted on a server such as Slideshare, so students cannot, or do not have to, download them. Slideware is also commonly combined with audio tools to create a narrated slideshow, which is then converted to video and shared using a service such as YouTube. This permits learners to stream the presentation.

Bandwidth should be a consideration in the selection and use of any tool. Bandwidth refers to the amount of data that can be transmitted over a connection and how fast that transmission can take place (generally expressed in bits per second). In areas without broadband Internet access, this will be of particular concern. To get a sense of how long it will take learners to download a file, use one of the many download speed calculators freely available on the World Wide Web.2
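The arithmetic behind those calculators is straightforward. Here is a minimal sketch in Python; the file size and connection speed are illustrative numbers, not figures from the text:

    # Estimate download time for a course file at a given connection speed.
    # Sizes are in megabytes (MB) and speeds in megabits per second (Mbps),
    # so the factor of 8 converts bytes to bits. Real-world throughput is
    # usually lower than the advertised rate, so treat results as a floor.

    def download_seconds(file_size_mb: float, speed_mbps: float) -> float:
        """Return the estimated download time in seconds."""
        return (file_size_mb * 8) / speed_mbps

    # A 50 MB narrated slideshow over a 2 Mbps connection:
    print(f"{download_seconds(50, 2):.0f} seconds")  # -> 200 seconds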


Another set of tools that enable informing are those that bring together information from a variety of web sources using a social bookmarking site such as Pinterest or Delicious. Tools such as these allow learners to see relationships between pieces of information based upon their placement on the "board." Learners can access the original information source directly from a link on the board. More advanced tools enable the embedding of web materials from blogs, websites, and Twitter into a story format. Programs like Storify support the development of a narrative or story around the information. This enables learners to see content within a specific context.

Explaining

Audio tools support explaining. They can be used alone, such as in a podcast, or in conjunction with an information presentation tool such as PowerPoint. Audio is flexible and easy to produce, making it particularly useful in providing clarification of information presented. Audacity is a free, online digital recorder and editor that allows for the creation of audio files. Pod-O-Matic enables the creation of podcasts, which were originally audio only; today, you can add images and other materials to a podcast. One advantage of podcasts is that learners can download them and listen any time they want; podcast is an acronym meaning "Portable, On-Demand, broadCasting."

Voicethread is a web-based application that enables instructors and learners to comment on images, videos, or documents. Completed "conversations" can be saved and embedded on web pages or blogs. Chirbit is a similar tool that also enables the embedding of audio in web pages or social media like Facebook, Twitter, or Tumblr. You can record directly from your browser and even generate a QR code for each audio post. QR codes are two-dimensional bar codes that can be read by mobile devices using a reader app. This makes Chirbit particularly powerful in carrying out the activity of explaining. You can also use it to extract audio from sources such as YouTube, which can be helpful when bandwidth is an issue.
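If you want to generate such a QR code yourself, the third-party Python package qrcode makes it a one-liner; a minimal sketch, where the URL is a placeholder rather than a real audio-post link:

    # Requires: pip install qrcode[pil]
    import qrcode

    # Encode a (placeholder) link to an audio post and save it as an image
    # that can be dropped into a handout, slide, or course page.
    img = qrcode.make("https://example.com/your-audio-post")
    img.save("audio_post_qr.png")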


Finally, Odiogo is an audio reader that you can add to your blog or other site to allow "readers" to listen to entries by way of an embedded "listen" button. Text-to-speech technology allows learners to transform content into alternate formats based upon preference or need. For instance, learners may prefer audio to text because it allows them to exercise or drive while they study. Learners with visual impairments can readily transform information into a format they can use.

Showing

Screencast tools enable the activity of showing, which is especially useful in facilitating learner interaction with content. Showing can also be used to demonstrate skills learners need to develop. Camtasia is a screen capture tool that enables you to create video tutorials. Tutorials can be used to show multistep processes that are difficult to explain using just text and images, such as using computer applications or balancing a spreadsheet. You complete the task on your computer, recording all of your activity while narrating a description of what you're doing, step by step.

Jing is a free, web-based program that enables instant screencasts of up to 5 minutes in either image (screenshot) or video formats; screencasts can be created with or without audio. They can be downloaded or uploaded to the Screencast.com server (operated by Jing's publisher) and shared with learners by way of a URL. This tool is a personal favorite for decisively answering learner queries about where to locate something within a course space or how to complete a particular task. I create a screencast to answer one learner and then share the link with other learners who likely have the same question.

Explain Everything is a screencasting tool with an interactive whiteboard. Specifically designed for the Apple iPad, this tool enables narration, annotation, and animation of whatever is on your screen. This may be particularly useful if you wish to demonstrate mathematical computations or make clarifying remarks at particular points within video files. The end product can be exported to a variety of file types (.pdf, .wav, .jpeg) and sent to a variety of destinations such as a website, Dropbox, or learning management system (LMS).

Engaging

Interactive content requires input from learners, engaging them with the content. It can take a variety of forms and is generally regarded as a positive enhancement to the learning experience, as interactivity tends to empower learner autonomy and interest. More research is needed to establish whether interactivity of content alone increases learning, but some results are encouraging (Jung & Choi, 2002; Zhang, Zhou, Briggs, & Nunamaker, 2006). As previously discussed, interactivity introduces a level of complexity to learning that should be considered. For example, results of a meta-analysis of computer-assisted instruction, media richness, and student performance suggested that using a medium that carries too much information may result in the receiver experiencing content overload (Timmerman & Kruepke, 2006). Choose the function and tool that enables the specific activity that fits you, your content, and your learners.

Forms, surveys, and quizzes are common functions that support the activity of engaging. Quizzes and surveys can add interest or assist in assessing knowledge, and all can be made simply using Google Docs. Hot Potatoes is freeware for creating interactive, web-based quizzes, crossword puzzles, and matching activities. Quizlet enables the creation of digital flashcards that students can manipulate. Flashcards and matching games aid in concept memory; a bare-bones sketch of this kind of interactivity follows this discussion. To be more engaging, consider guided inquiry. QuestGarden enables structured web quests through which students search for information and solve problems. Or use TED-Ed (the education arm of TED talks) to make interactive videos by choosing a video from YouTube and structuring learners' interactions with it. You can add multiple-choice and open-ended questions, link students to related articles and blogs, or start a guided or open-ended discussion, all within the tool.

If you're short on time or confidence to produce your own interactive content, much is already available through open educational resources (OERs). To find relevant resources, search Google using the terms "open education*resource*". Other sources of useful, interactive materials are the National Repository of Online Courses, offered through hippocampus.org, and the Open Learning Initiative.
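As promised above, here is a bare-bones, hypothetical illustration of what makes content "engaging" in this sense: the learner must act before the activity continues and receives immediate feedback. Real tools such as Quizlet or Hot Potatoes add shuffling, scoring, and richer item types; the cards and exact-match logic here are deliberately crude.

    # Minimal flashcard drill: each prompt demands learner input and
    # returns immediate feedback. Matching is naive exact comparison.
    cards = {
        "OER stands for...": "open educational resource",
        "Bandwidth is generally expressed in...": "bits per second",
    }

    def drill(deck):
        for prompt, answer in deck.items():
            guess = input(f"{prompt} ")            # learner acts to continue
            if guess.strip().lower() == answer:
                print("Correct!")
            else:
                print(f"Not quite. One good answer: {answer}")

    if __name__ == "__main__":
        drill(cards)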


There are many content-related media to choose from. Choose those that fit best with you and your learners; considerations such as cost, system compatibility, learning curve, and fit with content needs should inform your decisions. Next we turn to media functions and sample tools that enable learner-instructor interaction.

Media for Instructor-Related Activities

We continue our discussion of media needs with activities related to learner-instructor interaction. These are coordinating, conferencing, and responding. Table 5.4 intersects the three instructor-interaction activities with their primary media function and provides a list of sample tools to use in their support. A discussion of media that enable instructor-related activities follows.

Table 5.4. Sample Tools to Enable Instructor Interaction Related Activities.

Activity | Primary Media Function | Sample Tools
Coordinating | Organizing | When is Good; Doodle; Calendar; Facebook fan page; Google Voice; Eyejot
Conferencing | Meeting | Free Conference Call; Skype; Big Blue Button; Adobe Connect
Responding | Feedback | iAnnotate; Adobe Pro; VideoAnt; Calibrated Peer Review

Coordinating

Management tools designed to bring people together are effective in enabling the activity of coordinating in online classes. Coordinating most closely relates to the administrative and facilitative categories of learner-instructor interaction. Some of the tools will also support relational interaction, depending upon how they are employed. Web-based calendaring tools such as When is Good and Doodle assist with the scheduling of meetings. These tools can also support peer collaboration in setting meetings, either with or without instructor involvement. These applications have free versions as well as premium versions that provide more functionality for a subscription fee.

Google Voice is a service that enables web-based calling or texting by way of a number given to learners, which forwards to the instructor's cell phone. This enables both synchronous and asynchronous interaction without requiring the instructor to give out a personal cell number. Phone and text access increases teacher immediacy by shortening response times. Learner messages are forwarded to a cell phone; instructors may choose to receive the messages as either voice or text, as one of the program features is speech-to-text translation.


There is no need for the instructor to be online to receive messages. Eyejot is a video messaging platform that enables sending video messages asynchronously (i.e., via email). Whether students would welcome video messages from a professor is unknown, so instructors may want to consider learner preferences. The richer nature of video could support relational interaction.

Coordinating activities might also include expectation setting and facilitating learner interactions by reminding learners to be active in the course. Creating a Facebook fan page that learners can "like" enables posts to students' Facebook feeds without having to be "friends," something students may wish to avoid. This can be a good way to communicate short messages to your class without violating their expectations. The research shows students are divided about whether being "friends" with their instructors is desirable (Helvie-Mason, 2011). Not all students wish to "friend" faculty members, preferring instead to keep their social and professional realms separate (Hewitt & Forte, 2006). Personally, I favor separation of my social and instructional lives, and I prefer not to manage more than one Facebook account. For students and instructors who feel differently, or for those who have better mastered privacy controls, Facebook has been shown to increase student perceptions of teacher immediacy and class motivation (Mazer, Murphy, & Simonds, 2007, 2009). Once again, choose and employ the tool in the way that best fits your content, your learners, and your style.

Conferencing

Tools that allow for synchronous, verbal communication between interactants enable conferencing. Conferencing can easily be used to support learner interactions with content, instructor, and peers. These tools are considered rich, as they enable synchronous audio and video interactions. There is a learning curve in using conferencing tools, but it tends to be steepest for the instructor, depending upon how the tools are used. Most conferencing tools have built-in recording capability, enabling the meeting to be watched later. If you choose to record a session, be sure to alert learners that their audio and text comments will be recorded along with the rest of the conference.

Skype is a tool that enables video calls. The freeware version is most useful for one-to-one calls; group calls are limited to audio only. Premium account upgrades allow for video calls with multiple attendees. Screen sharing is a feature of Skype when video is turned on. Web conferencing software such as Adobe Connect (educational pricing available) or Big Blue Button (freeware) enables multiple attendees at web-based meetings, either with or without video.

96

REDESIGNING COURSES FOR ONLINE DELIVERY
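Because bandwidth constraints come up repeatedly with conferencing tools, it can help to estimate transfer times before assigning recorded sessions. The arithmetic behind the download-time calculators mentioned in Note 2 is simple; here is a minimal Python sketch, where the file size and connection speed are illustrative assumptions rather than figures from any particular tool:

# Rough download-time estimate for a recorded conference session.
# The 500 MB file size and 5 Mbps connection speed are illustrative.
file_size_mb = 500        # file size in megabytes
speed_mbps = 5            # connection speed in megabits per second

megabits = file_size_mb * 8          # 1 byte = 8 bits
seconds = megabits / speed_mbps
print(f"Approximate download time: {seconds / 60:.1f} minutes")  # ~13.3 minutes

On slower connections the same file can take hours, which is one reason an audio-only option like the one above can be the kinder choice.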

Responding

Feedback mechanisms enable responding. Feedback is a primary means by which learning occurs; Chickering and Gamson (1987) identify the provision of prompt feedback as one of the seven principles of good practice in education. Richer media tend to support more detailed and timely feedback. Feedback on learner-submitted documents can be provided using text, audio, or screencasting.

The "comment" and "track changes" features of Word enable text-based feedback, which chronicles every change to a document. This feedback can become a bit unwieldy for students to process as the margin of the document becomes cluttered with comment bubbles filled with deleted and moved text. Comment bubbles with actual comments are not distinguished from those denoting text changes, potentially masking instructor input. From personal experience, track changes is time consuming for the instructor if the document is in the early stages of development, but it is quite useful for close editing of nearly finalized documents.

Adobe Pro enables audio comments to be added to .pdf documents, so your comments are "attached" to the portion of the document to which they most relate. This can be very useful when reviewing early drafts, when feedback is primarily conceptual in nature. Screencasting works well for more visual submissions. For instance, images or web pages can be captured as you talk through them, providing comments at key moments and recording the session for the learner. For videos, VideoAnt enables text annotations for any video hosted on YouTube as well as those formatted as .mov or .flv. The comments are attached to particular segments of the video. This works well on assignments such as digital presentations or video productions.

If responding is to be done by peers, instructors will need a means to both facilitate the process and ensure its effectiveness. Calibrated Peer Review is a tool that helps to train and organize learners in providing feedback to one another on writing assignments. There are also auto-responding systems available to provide students with personalized, immediate feedback using established algorithms. When teaching business writing, I use a tool called My Access that provides feedback on student writing using a six-point rubric. The feedback is available immediately upon submission, allowing learners to revise and resubmit. This tool works well for my students in meeting my course learning objectives. Auto-responders are not hands-off systems, however; the instructor remains the ultimate evaluator. Still, these systems can be beneficial for both learners and instructors.

As with all media, choose those that best fit the needs of the learners and instructor, given the course content. Considerations such as learner and instructor expectations are essential; unmet expectations impede relationship building and overall satisfaction with the learning experience. Next we turn to media functions and sample tools that enable learner-learner interaction.

Media for Peer-Related Activities

The final set of activities we consider in terms of media functions are those that relate to learner-learner interaction. The tools we explore are those that enable the activities of networking, collaborating, and simulating. These are immersive activities that will benefit from both synchronous and asynchronous tool features. Table 5.5 intersects the three peer-interaction activities with their primary media function and suggests sample tools that could be used to enable the activities.

Networking

In online courses where peer interaction is privileged, networking is more than merely enabling connections between peers; it is a means to facilitate relationship building to support activities like collaborating. Relationships require trust, and trust entails predicting and depending upon the behavior of others (McDaniel & McDaniel, 2004). Perceived similarity also supports relationship development. In online courses, establishing social presence among learners is essential for relationships to develop. Tools that assist in enhancing social presence are those that help peers perceive one another as individuals, even when they can't "see" them.

Table 5.5. Sample Tools to Enable Peer-Interaction Related Activities.

Activity      | Primary Media Function | Sample Tools
Networking    | Social networks        | Bios/profiles in LMS, Facebook, Twitter, Celly
Collaborating | Co-creating            | Google Suite, Diigo, Stixy, Writeboard, Bubbl.us, Thinkature, Zoho Projects
Simulating    | Virtual world          | Second Life, Activeworlds, Atlantis Remixed, OpenSimulator, There, OERs

Learning Management Systems (LMS) have varying features that can assist. If you have a choice of systems, or of features you use in an assigned system, you might consider the following. Have users upload a profile photo that will appear next to their name anytime they post to a discussion board or comment through the LMS. This helps even if the photo is not of the learner (i.e., a cat, a flower, a truck); instructors should set the expectation that the image chosen should remain constant for the duration of the course. As discussed in Chapter 4, presence awareness indicators facilitate peer interaction and networking by indicating who else is online and available to chat. This is the virtual equivalent of "bumping into one another," which enables interaction.

There are a variety of social media means by which to facilitate networking activities. Instructors can encourage learners to connect through social media such as Facebook or Twitter, or through the more professionally oriented tool LinkedIn. More pedestrian tools like email and text messaging can also provide a means for learners to reach out to one another. Individuals build relationships with other individuals, not with groups or organizations (Bullis & Bach, 1991). Therefore, consider designing assignments that will support the building of interpersonal relationships, and then guide learners to tools that will help.

Collaborating

For learners to collectively generate outcomes, they will need tools that enable them to share information, support relationships, and co-produce work. Collaborative tools are two-way, with synchronous and asynchronous features. Learners may need guidance in choosing the features that best support the work. Text-based projects are well supported by document sharing and storage tools such as Google Docs or Dropbox. When further supplemented by communication tools such as Google Groups, for discussions and messaging, and Google Hangouts, for video conversations, the depersonalizing effects of computer-mediated communication can be reduced (Parker, 2003).

Web bookmarking tools can be useful for creating a library of resources for the team. Diigo is a social bookmarking site that allows content to be shared with select individuals. Evernote is software that functions as a web clipper, but it also allows you to share files and create notes (text, audio, or video), all organized in "notebooks" that can be shared with select individuals. For more complex projects, consider richer, more dynamic media. Zoho Projects includes a robust set of tools such as calendaring, document sharing, messaging, and task management features, which appear on a dashboard and tie everything together. In addition, there are social presence features such as member pictures and profiles to aid in feeling co-present. Thinkature similarly features a dashboard, with a less sophisticated format, but enables teams to collaborate on projects in real time, co-creating diagrams or slides.

Simulating

Immersive learning environments such as virtual worlds or simulations enable learners to model behaviors they might use if faced with a situation in "real life." Simulations can feel "fun" and provide opportunities for practice. Virtual worlds are excellent places to simulate complex tasks such as medical procedures or accident investigations. Second Life is one of the best-known 3-D virtual world programs. Learners create avatars that realistically represent the self, or they can appear as animals, robots, or vehicles. Some universities feature private "islands" upon which educational activities take place. Instructors using an unrestricted island may have "visitors" entering the class, which can be disruptive, just as it would be in a face-to-face class. Think of virtual worlds as just that: the world. In them you will find all types of activities and all kinds of people. Some caution is recommended, but there are some excellent examples of simulated learning taking place in Second Life (see Beard, Wilson, Morra, & Keelan, 2009, for a review of health-related activities on Second Life).

Other 3-D virtual worlds are available, each with slightly different features. In the following "worlds" you sacrifice some of the visual richness but gain more in terms of control. Activeworlds has an education-only island, Atlantis Remixed was designed solely for educational purposes, and OpenSimulator allows for self-hosting to more fully control the experience.

There are many ways to engage students in scenario applications without embarking into virtual worlds. They can be carried out using case study applications, Internet games, or commercially produced simulation modules. For our purposes, we included simulated activities that don't involve peers under the activity of engaging. Choose the activity and media with the shortest learning curve, so long as they meet course needs.

It's easy to get stars in your eyes over tools. The possibilities seem endless, but we don't want to use more tools than we need; media use is not an end in itself. Chris Hoadley, associate professor and director of the Educational Communication and Technology program at New York University, proposes three laws of educational technology that fit well with our overall approach. Hoadley (2013) argues:

1. It's not the technology, it's what you do with it.
2. It's not what the technology makes possible, it's what the technology makes easy.
3. Pay attention to the trends in learning, not in technology.

To facilitate the selection of media that is best suited to content, learners, and instructor, we need to revisit the work completed in previous phases of course redesign.

SELECTING MEDIA TO FIT PREVIOUS REDESIGN DECISIONS

We can see how it would be easy to get overwhelmed without a structured process for media selection. The most critical step in this process is accurately identifying the activities you need media to enable. To do this we need to revisit the course redesign work completed during the Design and Interaction phases of the process. The action verbs we used in defining learning objectives, along with those used in outlining choices for information to share (S), skills to illustrate (I), practice to guide (G), and progress to nurture (N), will reveal the course activities and types of interaction involved. These can then be matched to specific tools, with or without the help of an educational technologist (see Note 3).

To demonstrate this process, let's bring the phases of course redesign together in a diagram that can function as a graphical organizer (i.e., a storyboard) for our course. We'll extend our redesign efforts from Chapter 4 by applying graphical organizers to the playground and symphony metaphors. As previously discussed, each metaphor implicitly organizes the course into modules within which associated activities will fit. The graphical organizer makes the implicit explicit, pulling course activities into the course layout to generate a list of specific media needs that will narrow the scope of the media selection process.

Extension of Playground Metaphor

As discussed in Chapter 4, the playground metaphor implicitly organizes courses around individual content elements, making it ideal for survey courses in which learners are initially exposed to content with little connection to their previous knowledge. Each piece of playground equipment might represent a set of course activities derived from your learning objectives and other design choices, along with the types of interactions we associated with those choices. By analyzing previous decisions, we can easily identify the course activities, to be enabled by media, needed to achieve them.

We'll begin with a quick review of the redesign work we accomplished previously. Looking at the course through the lens of the playground metaphor, the following decisions were made related to learning objectives and instructional methods. We intentionally made the objective skill-based to fit with a learn-by-doing approach (see Chapter 3 for a full description of the process). In keeping with an action-oriented approach, we articulated aspects of our design related to instructional methods using parallel phrasing, each choice beginning with an action verb. To refresh, those aspects are decisions we made about information to share, skills to illustrate, activities for practice, and the means to nurture progress. Table 5.6 synthesizes the design and interaction decisions made for a higher education course in organizational communications in which students were mostly unfamiliar with the content. This is an abbreviated version of the design document developed in Chapter 4. We'll be using this information in our analysis of the course activities that will let us identify our media needs.

Table 5.6. Revisiting Redesign Decisions for the Playground Metaphor.

Objective: Learners will gain experience in communicating with customers, managers, and peers by creating messages designed to achieve an assigned purpose for a specific audience using various formats and correct grammar.

Share:
(1) Display various message formats to familiarize learners (business letters, memos, reports, email, brochures, newsletters).
(2) Describe the rationale and process of audience analysis to prepare them to match messages to the audience (demographics, size, status, perspectives on topic).
(3) Outline steps in the writing process to prepare them for creating messages (organization, research, editing, revision, proofreading).
(4) Review writing mechanics to help them use correct grammar (punctuation, capitalization, spelling, bias-free language).

Illustrate:
(1) Explain effective use of each message format.
(2) Demonstrate audience analysis.
(3) Compare and contrast effective and ineffective messages.
(4) Show common writing errors.

Guide:
(1) Assign practice in writing mechanics.
(2) Structure tasks related to the writing process (audience analysis task; research task; outline message task; draft and revision task).
(3) Develop messages for submission.

Nurture:
(1) Encourage self-assessment using rubrics.
(2) Evaluate writing mechanics through an auto-feedback mechanism.
(3) Respond with performance feedback on submissions.

Graphically Organizing the Course

As a reminder, the playground metaphor lends itself to organizing the course around separate content modules. As with equipment on a playground, the metaphor implies content can be chunked into independent units that function separately but collectively move the learner toward the objective. As on a playground, there is no interdependency across modules, but improved ability in one unit will likely strengthen performance in others.

Revisiting the learning objective, we can identify five basic performance elements. Learners are to (1) gain experience in (2) communicating messages, (3) using different formats and (4) correct grammar, so they (5) reach a given audience. Fig. 5.1 depicts a diagram of the course based upon the objective and inspired by the metaphor.

Fig. 5.1. Course Diagram of Learning Activities: Playground Metaphor.

The course has five boxes, each representing a different piece of playground equipment from the metaphor. Each is labeled with the content most closely related to one of the five performance elements provided by the objective: the writing process equates to "communicating messages," the audience analysis process equates to "reaching a given audience," the mechanics of writing equates to "correct grammar," along with boxes for document "formats" and "gaining experience." Each box represents a learning module and is filled with action verbs related to previously chosen design elements comprising the instructional methods: share information, illustrate skills, guide practice, and nurture progress. The result is a graphical representation of the overall course organized into five learning modules. Later we will connect the action verbs to one of the 10 course activities described previously; this is how we'll determine our media needs. Again, the verbs are drawn from the redesign decisions made in Chapter 4 and depicted in Table 5.6. They are placed in the module for which the action provides support.

Let's look at one of the action verbs by way of example. Take the action verb outline (the process) from the share information section of Table 5.6. It is placed in the Writing Process module because learners will need this information shared with them before they can engage in the writing process. One of the actions that will help learners to use the writing process is to have the parts of the process outlined for them.

In looking at the modules in Fig. 5.1, we see each contains varied instructional elements. For instance, the writing process not only involves information to be shared, but also guided practice (structure the tasks) and nurtured progress (encourage self-assessment). Let's unpack things a bit further by looking at another module. In the Mechanics module, there is information to be shared (review rules), skills to be illustrated (showing errors), and practice to be guided (assigning activities). This module integrates various instructional methods to achieve the performance of correct grammar required by our objective. Other modules are similarly organized. Match the action verbs listed in the remainder of the modules back to those used in the design document contained in Table 5.6 to get the full effect. Once the layout and content of the course modules are decided, we're ready to translate these design elements into the course activities we discussed at the start of the chapter. Again, these activities will provide the framework for identifying our media needs, which will guide our media selection process.

Identifying Course Activities

Previously, we identified 10 course activities, each related to one of the three types of interactions examined in Chapter 4. Table 5.7 lists the activities and connects them to the action verbs used in our design plan (see Table 5.6). Notice how actions listed under sharing information fit neatly into the content-related activity of informing. Similarly, actions listed under illustrating skills fit into the activities of explaining and showing. Actions under guiding practice closely relate to engaging. Nurturing progress is closely tied to the instructor-related activity of responding. Notice the absence of peer-interaction related activities in Table 5.7, which makes sense given the playground metaphor.

Table 5.7. Course Activities and Media Needs: Playground.

Course Activity | Action Verb | Media Need | Sample Tool
Informing | Outline; Describe; Review | Slideware; e-text; means to assemble materials from across the web with explanation | PowerPoint; (e-text publisher); Storify
Explaining | Explain; Compare & contrast | Podcast with images; audio connected to documents with ability for discussion | Pod-o-matic; Voicethread
Showing | Display; Demonstrate; Show; Structure | Sample documents; screencast video; screencast images with audio; interactive whiteboard (for diagramming and student response) | PDF files; Camtasia; Jing; Explain Everything
Engaging | Assign; Develop | Games, ALEs, quizzes; scenarios | Grammarly, Hot Potatoes; Glo Maker or OERs
Coordinating | -- | -- | --
Conferencing | -- | -- | --
Responding | Respond; Evaluate; Encourage | Text, audio, or screencast; auto-response, quiz, or text; interactive rubric | Audio comment in Adobe Pro; Google Docs or quiz tool in LMS; rubric tool within most LMS
Networking | -- | -- | --
Collaborating | -- | -- | --
Simulating | -- | -- | --

As discussed in Chapter 4, learner-content interaction is most privileged because, like children on a playground, learners will explore the equipment (modules) mostly independently of one another. A child on a swing may swing next to another child, but they don't swing collaboratively. So it would go in a course framed by the playground metaphor. You may decide your learners will enjoy the course more if they are aware of one another (swinging can be more fun if you have another person next to you to motivate you to swing higher). Make that choice based upon content, learners, and instructor. If it fits, add networking activities to the list in Table 5.7 and indicate the actions you'll build into your course to facilitate connections.

Similarly, notice that only responding is featured as an instructor-related interaction activity. Again, this fits with the playground metaphor and our previous analysis. In Chapter 4, we determined that learner-instructor interaction would be limited to answering questions, providing evaluative feedback, and monitoring student activity. Children on a playground feel safe with adult supervision, but they don't welcome adult intervention unless there is trouble. Extending the metaphor, the learner would look to the instructor to facilitate their interaction with the content and may need administrative support, but relational interaction would likely not be featured. Of course, as with children on a playground, some learners may seek out a relationship with the instructor, so you may want to consider tools to enable one, at learner discretion.

Translating design elements into course activities provides a framework for determining our media needs. That helps to narrow the scope of tools to consider in media selection. Revisit the lists of sample tools in Tables 5.3-5.5 to refresh on the connection between needs and tools. At this point, you can choose the tools yourself, or consult an educational technologist who can guide you to tools that will best meet the needs you've identified. Suggestions for tools to be used in the course depicted in Fig. 5.1 are included in Table 5.7. To once again demonstrate the power of the metaphor in framing your course redesign, let's look at how using the metaphor of the symphony alters our course activities and media needs.

Extension of Symphony Metaphor

The symphony metaphor implies collaborative effort, organizing courses around socially constructed knowledge units. The metaphor casts learners as both information resources (experts in playing their individual instruments) and collaborators (it takes an orchestra to play a symphony).

Instructors, like conductors, would mainly facilitate the learning (music) through coordinating peer interactions, providing guidance for content activities, and assisting individual learners in recognizing and remediating performance weaknesses. Once again, we'll revisit decisions made during previous phases of redesign to help us identify course activities to be enabled by media. Notice that the learning objectives differ slightly under this metaphor. For peers to be cast as information resources and/or collaborators implies they have a base of knowledge upon which to draw. Table 5.8 synthesizes the design and interaction choices we made for a higher education course in organizational communications using the symphony metaphor.

Table 5.8. Revisiting Redesign Decisions for the Symphony Metaphor.

Objective: Learners will effectively respond to situations involving customers, managers, and peers, identifying the purpose and audience to be addressed using a format, writing style, and message strategy that would successfully resolve the situation.

Share:
(1) Describe the importance of identifying purpose and audience and researching both.
(2) Review various formats for messages and how they are best applied.
(3) Outline steps of message construction, providing clear connections back to purpose.

Illustrate:
(1) Show sample messages, highlighting how purpose and audience align with information provided and message tone.
(2) Contrast poorly constructed messages, demonstrating where the errors in approach are most apparent.
(3) Examine classic examples of message outcomes, highlighting positive and negative consequences.

Guide:
(1) Reinforce the need for appropriate writing style and sound mechanics through assessment and practice.
(2) Distribute scenarios for learners to collaboratively analyze in order to determine the purpose, audience, and best approach to resolve the situation. Written responses could be authored by individuals or by teams.
(3) Organize the message construction process by setting dates for staged reviews.

Nurture:
(1) Facilitate class analysis by inviting discussion about other teams' responses to situations.
(2) Direct discussion of potential consequences for each of the class responses. Encourage peer and self-assessment to identify performance gaps.

Graphically Organizing the Course

As we discussed in Chapter 4, the symphony metaphor lends itself to integrated rather than separate learning modules. Unlike the playground, which presented modules as distinct endeavors, the symphony uses modules that are interdependent. Lack of development in any of the skills would result in poor overall performance. The course likely involves individual, team, and class-level interactions, similar to the way a symphony requires individual and sectional practice sessions before bringing things all together during a full orchestra rehearsal. Similarly, modules in the course will likely need to be completed in a given sequence.

Revisiting the learning objective, we can identify five basic performance elements. Learners will (1) effectively respond to situations involving customers, managers, and peers, (2) identify the purpose and audience to be addressed using a (3) format, writing style, and (4) message strategy that would (5) successfully resolve the situation. Fig. 5.2 depicts a diagram of the course objective, inspired by the metaphor.

Fig. 5.2. Course Diagram of Learning Activities: Symphony Metaphor.

The course is now organized into six modules; the five primary performance elements are represented, along with a box for individual skill development in support of the other elements. As with the playground metaphor, we gave each of the boxes a content label related to the performance elements and then filled them with action verbs related to instructional methods: share information, illustrate skills, guide practice, nurture progress. Unlike the design plan using the playground metaphor, the symphony metaphor places some action verbs in more than one module; instructor-related interactions even appear between modules. This reflects the integrated nature of the content as well as the interdependency of the modules. The next step is to translate those action verbs into course activities that will dictate media needs.

Identifying Course Activities

Table 5.9 presents the action verbs translated to course activities. This time there are content-, instructor-, and peer-related activities featured.

Table 5.9. Course Activities and Media Needs: Symphony.

Course Activity | Action Verb | Media Need | Sample Tool
Informing | Describe; Review; Outline | Narrated slideware; audio-annotated documents; e-text | Haiku Deck; Adobe Pro (pdf); (e-text publisher)
Explaining | Examine | Screencast video | Camtasia
Showing | Show; Contrast | Screencast images with audio; audio connected to documents with ability for discussion | Jing; Voicethread
Engaging | Reinforce | Quizzes plus related review materials from across the web (or e-text); self-paced practice activities related to review materials | OERs or self-produced using Glo Maker
Coordinating* | Organize | Means to display schedule and reminders | Google Calendar with GMinder; Facebook fan page
Conferencing* | Direct | Web conferencing with screen sharing and audio | Big Blue Button
Responding* | Encourage | Skill rubrics for self and peer assessment (related to assignments) | iRubric and Calibrated Peer Review
Networking* | Facilitate | Means to connect learners and stimulate idea exchange | Twitter
Collaborating* | Distribute | Means to facilitate peer interaction and collaborative writing | Zoho Projects
Simulating* | -- | -- | --

*There are many collaborative tools that also have features to enable coordinating, conferencing, networking, and responding. Keep things manageable by using media that integrate multiple activities if possible. Consider consulting an educational technologist for help in identifying more robust media solutions.

The more activities, the more complex the media needs. Media can often meet more than one need, but continued discipline in selecting tools will yield the best choices. Begin by articulating the need for each activity, and then consider consulting an educational technologist (see Note 4) to find tools that will meet multiple needs. Keeping media manageable will enhance everyone's experience. For the sake of illustration, sample tools to support each need are included in Table 5.9.
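For instructors who like to keep the storyboard machine-readable, the verb-to-activity-to-need chain can also be recorded as a simple lookup structure. Here is a minimal Python sketch, with entries drawn from Table 5.9; the dictionaries and function below are illustrative conveniences, not part of the DIME model, and a spreadsheet would serve equally well:

# Illustrative lookup tables built from Table 5.9 (symphony metaphor).
verb_to_activity = {
    "describe": "informing", "review": "informing", "outline": "informing",
    "examine": "explaining",
    "show": "showing", "contrast": "showing",
    "reinforce": "engaging",
    "organize": "coordinating",
    "direct": "conferencing",
    "encourage": "responding",
    "facilitate": "networking",
    "distribute": "collaborating",
}

activity_to_need = {
    "informing": "narrated slideware, annotated documents, e-text",
    "explaining": "screencast video",
    "showing": "screencasts and discussable documents",
    "engaging": "quizzes and self-paced practice",
    "coordinating": "schedule display and reminders",
    "conferencing": "web conferencing with screen sharing and audio",
    "responding": "skill rubrics for self and peer assessment",
    "networking": "means to connect learners and exchange ideas",
    "collaborating": "peer interaction and collaborative writing",
}

def media_need(verb):
    """Trace one design-document action verb to its activity and media need."""
    activity = verb_to_activity[verb.lower()]
    return f"{verb} -> {activity} -> {activity_to_need[activity]}"

print(media_need("Distribute"))
# Distribute -> collaborating -> peer interaction and collaborative writing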

CONCLUSION

This chapter focused on the media selection process, demonstrating how it flows from the design and interaction phases and is filtered through our chosen metaphor. Far too often, instructors will default to digitizing their traditional classroom approach, default to institutionally provided tools, or assume they lack the skills required to bring their course online. A better approach is to let pedagogy drive your choices. Matching interaction needs with the "right" tool is a matter of zeroing in on media needs using the course metaphor, identifying the course activities from the action verbs used in the design, and connecting the activities to media needs.

Throughout the course redesign process, we've stressed the idea that choices should fit the content, learners, and instructor. This is especially true when it comes to media selection. Technology choices are not neutral; they have consequences. Media influence who interacts with whom, or with what content, potentially restructuring course elements. Research indicates that learners have channel preferences that they bring with them to class. For instance, many prefer to email instructors rather than use social media, as it seems more appropriate to the nature of the relationship (Hewitt & Forte, 2006). Learners can also choose to employ media faithfully or ironically. When tools are used faithfully, they are employed by learners as intended; ironic use of tools means learners adapt them to some other purpose, which can introduce uncertainty and ambiguity into an already complex situation.

Ultimately, the goal is to choose the simplest tool that enables as many of your course activities as possible. Plan for how the tools should be used, and be prepared to offer training or to set the expectation that learners will gain the needed skills using outside resources. Consider the potential consequences of tool choices, and consider establishing a feedback loop so learners can alert you to any media issues that are constraining their learning experience.

Finally, you don't have to go it alone. Use high-quality, preprepared materials whenever possible, including tutorials for the various tools you choose. Consult technology experts for support; just stay true to your redesign decisions. No one knows your content, your learners, or your instructional style better than you do.

With your media selected, you are well on your way to putting together your course. There's just one more phase in the redesign process, but it's one of great concern to learners, instructors, and society at large. The final phase is focused on considerations related to evaluation. In the next chapter, we explore evaluation of student learning, course quality, and instructor effectiveness as affected by online delivery.

NOTES

1. The Teaching Professor blog can be found at http://www.facultyfocus.com/topic/articles/teaching-professor-blog
2. Two download speed calculators are available at www.download-time.com and http://bandwidth.com/tools/calc.html
3. Educational technologists appreciate it when instructors come to them with clearly defined media needs. This makes their job easier and ensures the needs of your learners and content are paramount in the media choice process. Consult with a technologist after you've done the preliminary redesign work yourself.
4. If you do not have access to an educational technologist, there are a number of quality blogs and web resources to assist you in identifying media features and ratings. Consult blogs at Educause, Edudemic, and the Faculty eCommons to start.

REFERENCES

Alavi, M., & Dufner, D. (2005). Technology-mediated collaborative learning: A research perspective. In S. R. Hiltz & R. Goldman (Eds.), Learning together online: Research on asynchronous learning networks (pp. 191-213). Mahwah, NJ: Lawrence Erlbaum.
Anderson, L. W., Krathwohl, D. R., & Bloom, B. S. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York, NY: Longman.
Barley, S. R. (1990). The alignment of technology and structure through roles and networks. Administrative Science Quarterly, 35(1), 61-103.
Beard, L., Wilson, K., Morra, D., & Keelan, J. (2009). A survey of health-related activities on Second Life. Journal of Medical Internet Research, 11(2). Retrieved from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2762804/. Accessed on July 5, 2013.
Bloom, B. S. (1956). Taxonomy of educational objectives, Handbook I: The cognitive domain. New York, NY: David McKay.


Bullis, C., & Bach, B. W. (1991). An explication and test of communication network content and multiplexity as predictors of organizational identification. Western Journal of Speech Communication, 55, 180-197.
Chickering, A., & Gamson, Z. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 39, 3-7.
Daft, R. L., & Lengel, R. H. (1986). Organizational information requirements, media richness, and structural design. Management Science, 32(5), 554-571.
Daft, R. L., Lengel, R. H., & Trevino, L. K. (1987). Message equivocality, media selection, and manager performance: Implications for information systems. MIS Quarterly, 11(3), 355-366.
Douglas, S. (2012). Student engagement, problem based learning and teaching law to business students. e-Journal of Business Education & Scholarship of Teaching, 6(1), 33-47.
Driscoll, M., & Carliner, S. (2005). Advanced web-based training strategies. San Francisco, CA: Pfeiffer.
Grant, M. M., & Cheon, J. (2007). The value of using synchronous conferencing for instruction and students. Journal of Interactive Online Learning, 6(3), 213-226.
Helvie-Mason, L. (2011). Facebook, friending, and faculty-student communication. In C. Wankel (Ed.), Teaching arts and science with the new social media (pp. 61-87). London: Emerald Group.
Hewitt, A., & Forte, A. (2006). Crossing boundaries: Identity management and student/faculty relationships on Facebook. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=868D59E247180E9FC80D6F415F8C8ECC?doi=10.1.1.94.8152&rep=rep1&type=pdf. Accessed on April 12, 2011.
Hoadley, C. (2013, June 10). Three laws of ed tech. The Lab Report. NYC Media Lab. Retrieved from http://www.nycmedialab.org/blog/2013/06/chris-hoadleys-three-laws-of-edtech. Accessed on July 6, 2013.
Hoeft, M. E. (2012). Why university students don't read: What professors can do to increase compliance. International Journal for the Scholarship of Teaching and Learning, 6(2), 1-19. Retrieved from http://academics.georgiasouthern.edu/ijsotl/v6n2/articles/PDFs/Acc%20Art_Hoeft.pdf. Accessed on July 4, 2013.
Jung, I., & Choi, S. (2002). Effects of different types of interaction on learning achievement, satisfaction and participation in web-based instruction. Innovations in Education and Teaching International, 39(2), 153-162.
Kinchin, I. (2012). Avoiding technology-enhanced non-learning. British Journal of Educational Technology, 43(2), E43-E48.
Kirkley, S. E., & Kirkley, J. R. (2004). Creating next generation blended learning environments using mixed reality, video games and simulations. TechTrends, 49(3), 42-53, 89.
Little, B. B., Passmore, D., & Schullo, S. (2006). Using synchronous software in web-based nursing courses. Computers, Informatics, Nursing, 24(6), 317-325.
Mazur, J. P., Murphy, R. E., & Simonds, C. J. (2007). I'll see you on Facebook: The effects of computer-mediated teacher self-disclosure on student motivation, affective learning, and classroom climate. Communication Education, 56(1), 1-17.
Mazur, J. P., Murphy, R. E., & Simonds, C. J. (2009). The effects of teacher self-disclosure via Facebook on teacher credibility. Learning, Media and Technology, 34(2), 175-183.
McBrien, J. L., & Jones, P. (2009). Virtual spaces: Employing a synchronous online classroom to facilitate student engagement in online learning. International Review of Research in Open and Distance Learning, 10(3), 1-17.
McDaniel, D., & McDaniel, R. R. (2004). A field study of the effect of interpersonal trust on virtual collaborative relationship performance. MIS Quarterly, 28(2), 183-227.


McKeachie, W. (2002). McKeachie's teaching tips: Strategies, research, and theory for college and university teachers (11th ed.). Boston, MA: Houghton Mifflin.
McLuhan, M. (1994). Understanding media: The extensions of man. Cambridge, MA: MIT Press.
Merriam-Webster Online. Interactive [Def. 1 & 2]. Retrieved from http://www.merriam-webster.com/dictionary/interactive. Accessed on July 2, 2013.
Wiener, N. (1954). The human use of human beings: Cybernetics and society. Boston, MA: Houghton Mifflin.
Parker, I. (2001). Absolute PowerPoint: Can a software package edit our thoughts? The New Yorker, 77(13), 76-87.
Parker, R., & Ingram, A. (2011). Considerations in choosing online collaboration systems: Functions, uses, and effects. Journal of the Research Center for Educational Technology, 7(1), 2-15.
Parker, R. E. (2003). Distinguishing qualities of virtual groups: An issue-oriented perspective. In R. Y. Hirokawa, R. Cathcart, L. Samovar, & L. Henman (Eds.), Small group communication: Theory and research. Los Angeles, CA: Roxbury.
Rovai, A. P. (2002). Sense of community, perceived cognitive learning, and persistence in asynchronous learning networks. The Internet and Higher Education, 5(4), 319-332.
Scheibehenne, B., Greifeneder, R., & Todd, P. M. (2010). Meta-analytic review of choice overload. Journal of Consumer Research, 37(3), 409-425.
Schrader, S., Riggs, W. M., & Smith, R. P. (1993). Choice over uncertainty and ambiguity in technical problem solving. Journal of Engineering and Technology Management, 10(1-2), 73-99.
Scott, C. R., Quinn, L., Timmerman, C. E., & Garrett, D. M. (1998). Ironic uses of group communication technology: Evidence from meeting transcripts and interviews with group decision support system users. Communication Quarterly, 46(3), 353-374.
Sharp, A., & McDermott, P. (2001). Workflow modeling: Tools for process improvement and application development. Norwood, MA: Artech House.
Silberman, M. (1996). Active learning: 101 strategies to teach any subject. Des Moines, IA: Prentice-Hall.
Sims, R. (2003). Promises of interactivity: Aligning learner perceptions and expectations with strategies for flexible and online learning. Distance Education, 24(1), 87-103.
Smart, K. L., & Cappel, J. J. (2006). Students' perceptions of online learning: A comparative study. Journal of Information Technology Education, 5, 201-219.
Timmerman, C. E., & Kruepke, K. A. (2006). Computer-assisted instruction, media richness, and college student performance. Communication Education, 55(1), 73-104.
Young, J. R. (2004). When good technology means bad teaching. Chronicle of Higher Education, 51(12), A31-A32.
Zhang, D., Zhou, L., Briggs, R. O., & Nunamaker, J. F., Jr. (2006). Instructional video in e-learning: Assessing the impact of interactive video on learning effectiveness. Information & Management, 43(1), 15-27.

CHAPTER 6

EVALUATION CONSIDERATIONS

ABSTRACT

Evaluation is the process by which we estimate how things should go, explore how things are going, and determine how things went in terms of course redesign. In this chapter, we examine formative and summative methods for assessing student learning and establishing teacher effectiveness and course quality. Evaluation is a subjective, value-laden process. To introduce the rigor needed to make it meaningful, evaluation should be multifaceted, planned in advance, made transparent to learners, and employ valid and reliable methods. Moving courses online presents both opportunities and challenges for evaluation. We explore ways to implement assessment to make full use of the advantages of technology while mitigating the problems associated with online delivery.

Keywords: Course redesign; online education; course evaluation; assessment; student feedback; online evaluation

INTRODUCTION

Decisions made throughout the redesign process have all focused on bringing about changes: changes in our learners, changes in our course, and changes in ourselves, as instructors. Learning is a change process. Through it, learners develop new proficiencies; their experiences should leave them enriched, changed for the better. The same holds true for our redesigned course. Our deliberate attention to key considerations in the design, interaction, media, evaluation (DIME) model should result in an improved experience that makes the best use of technology, increasing student access and introducing teaching and learning efficiencies, without sacrificing quality. To ensure we arrive at our destination, we turn our attention to the final phase of course redesign: making choices related to evaluation.

We have already been making some evaluation-related choices during the other phases of course redesign. Now, we need to formalize those assessment decisions. In this chapter, we use the terms evaluation and assessment interchangeably, which occurs often in the literature (see Bartley, 2006 for relevant discussion). We feature the term evaluation in the model, but, given our purpose, will not distinguish it from assessment. During this phase of redesign, we focus on systematic processes by which we collect and interpret data to determine three things: (1) how things should work in our course, (2) how things are working in our course, and (3) how things worked in our course. Data may be used to inform decisions at a variety of levels. Using the DIME model of course redesign, we investigate evaluation at the student, instructor, and course levels.

Types of Evaluation

Assessment data may be used for formative or summative purposes. Formative data is used to inform decisions made before and during a learning experience, providing answers to questions about how things should work and how things are working. Formative data offers the opportunity to make adjustments to the learning experience as it unfolds. Summative data is used to answer questions about how things worked; it is most often collected at the end of a learning activity or course. Summative data from one learning activity may become formative data for the next.

Formative Assessment

Formative evaluation can assist you in knowing your students, enabling you to retrofit your design, interaction, and media decisions to a particular set of learners. For instance, the administration of surveys that assess learner readiness, the giving of pretests to determine learner proficiency in key skills, or the design of orienting activities that gauge learner autonomy can all guide your instructional approach and answer questions about how things should work. Formative evaluation can also be used to assess things along the way to determine how things are going. For instance, the review of student activity reports can help gauge learner participation. The administration of intermittent attitude checks can identify the need for instructor intervention, and the use of low- or no-risk assignments can assess learner understanding of course concepts. Together, formative assessments can provide a wealth of information about how students are experiencing a course while they are enrolled and there is still time to make changes.

Competency tests may also be formative when they are used to examine learner mastery of knowledge and skills that will be needed for later tasks, assuming the potential for remediation. Competency tests are often used to ascertain learner achievement of the learning objectives; when used this way, they are a source of summative data.

Summative Assessment

Summative evaluation collects information to ascertain how things went. Assessments that focus on student learning are performance-based. They are designed to answer questions about how much progress learners ultimately make toward meeting the course objective(s). Summative data may also reflect learners' levels of satisfaction with the course, as well as outline specific actions students took in support of their own learning. Together, this data can be used in making decisions about revising the course for the next session.

Nature of Assessment

Assessment is a subjective, value-laden activity. Decisions about what data to collect, how to collect it, and how to use it are filtered through the perspectives of decision makers; they determine which activities are worthy of evaluation and which are not. In the case of course redesign, the instructor is the decision maker. In designing and implementing evaluation, the aim is to make it rigorous and transparent, but it won't be neutral. Attention should be paid to the reliability and validity of methods to ensure that evaluation strategies actually measure what they are designed to measure. Even with that, the National Research Council (2001) argues that there is no way to truly ascertain what a learner knows. The best we can do is estimate, using quality tools to help. The quality of the tools directly affects the quality of the information gathered; evaluation needs a systematic approach.

Decisions about evaluation need to be made reflectively, as they will likely affect both learner and instructor behaviors. The nature of the assessments, and how they will be used, has been found to strongly influence learner behavior. Biggs (1985) reports on a series of studies that examined factors that influence the learning strategy students apply. The strategies were organized in three levels: surface, achieving, or deep. Deep-level learning is indicative of the greatest effort by learners and results in the integration of new knowledge and skills into learners' permanent repertoires, so they can be drawn on and applied across future contexts. Surface-level learning is a strategy used to meet short-term expectations, for instance passing a test; any change in the learner will be temporary. Achieving-level strategies are used by students motivated more by attaining the status that comes with earning high grades than by a desire to really learn the material. They behave as model students, but there may be no permanent integration of the skills and knowledge that come with deep-level learning (Marton & Saljo, 1976; Ramsden, 2005). One of the factors found to influence the strategy learners employ is the way the task will be evaluated; task evaluation, together with personal and motivational factors, influences actions and effort (Biggs, Kember, & Leung, 2001).

In addition to affecting student behavior, evaluation also influences instructor behaviors. Outcomes from formative and summative evaluation affect instructional practice. In order to make good decisions, evaluation needs to be well designed and relevant to your learning objectives as well as to your learners. Given the purpose, power, and subjectivity of evaluation, it is essential to be reflective and multifaceted in your approach. Consider the questions you need to answer, and the methods you'll use to answer them, as part of your redesign process. The inherent lack of neutrality in evaluation is not in itself problematic, but it is important that the subjectivity is recognized, that the approach to evaluation is balanced, and that evaluation is purposeful and meaningful. As you finalize your redesign choices, develop an evaluation process that provides the information you need to make good decisions. In this chapter, we investigate the assessment of student learning and explore its implications for instructor effectiveness and overall course quality.

EVALUATION IN ONLINE COURSES

As we move courses online, we need to rethink the ways we conduct evaluation. As with the other instructional elements, we don't want simply to digitize the old approaches. As we've shown throughout the redesign process, the nature of the online environment and the learning it enables are different and require new approaches to teaching and learning. We open our discussion of evaluation with an exploration of the advantages and challenges that accompany evaluation in online courses.

Changes to Evaluation Processes

The general purposes of evaluation remain unchanged for online courses, but technology does enable new forms of evaluation. It also presents some unique challenges. For instance, during the design phase in Chapter 3, we established that online environments lend themselves well to a "learn by doing" approach. This makes student learning more visible through the production of project-based outcomes. These outcomes may better estimate student competencies than a score on an examination, which is an advantage. However, students may not put enough time into completing the projects, or they may perceive there to be insufficient time or support to get them done. Studies report that online students put in significantly less time than students attending face-to-face classes. According to Horspool and Lange (2012), online students spend an average of just 3.2 hours per week on their course work, compared with an average of 5.53 hours for classroom-based learners. Fewer hours don't necessarily translate to poorer student effort or learning; however, these statistics bear some consideration when developing course assessments. In the remainder of this section, we identify and explore a series of advantages and challenges of online evaluation. Generally, challenges can be addressed using technology-enabled advantages.

Advantages of Online Evaluation

Many learning management systems have robust reporting tools that track student behavior, for instance, what materials they opened, when, and for how long. Depending upon your choice of technology, this information can be distilled into reports or placed on a dashboard for instructor use. Student activity logs can be particularly useful in spotting learners who may be at risk due to inactivity, something we rarely have access to face-to-face. Logs also provide clues about learner experiences in relation to ease of course navigation and the perceived value of content. This information is useful in making course revisions.

The content of class discussions is archived in online courses. This data is generally unavailable in face-to-face courses. The archives allow contributions to discussions to be analyzed and used as part of student performance data. If "participation" is a course expectation, there can be more transparency in terms of the quality and quantity of contributions expected, and also in terms of scoring. Archived discussions also make participation more visible, allowing learners to better see their ongoing contributions and compare them to their peers'. Rubrics are recommended when scoring participation; we'll revisit the idea of rubrics in the section on integrating evaluation later in the chapter.
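To make the activity-log advantage concrete, here is a minimal Python sketch of how an exported report might be distilled into an at-risk flag; the student names, dates, and 10-day threshold are illustrative assumptions, not the output format of any particular LMS:

from datetime import datetime, timedelta

# Hypothetical last-access dates pulled from an LMS activity report.
last_access = {
    "Avery": datetime(2013, 10, 14),
    "Blake": datetime(2013, 11, 2),
    "Casey": datetime(2013, 10, 30),
}

today = datetime(2013, 11, 4)
threshold = timedelta(days=10)   # illustrative inactivity cutoff

for student, seen in sorted(last_access.items()):
    gap = today - seen
    if gap > threshold:
        print(f"{student}: no activity for {gap.days} days -- consider checking in")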

contributions to discussions to be analyzed and used as part of student performance data. If “participation” is a course expectation, there can be more transparency in terms of quality and quantity of contributions expected and also in terms of scoring. Archived discussions also make participation more visible, allowing learners to better see their ongoing contributions and compare them to their peers. Rubrics are recommended when scoring participation; we’ll revisit the idea of rubrics in the section on integrating evaluation later in the chapter. Challenges of Online Evaluation One of the challenges to online evaluation is the lack of direct observation of learners. Without the chance for direct observation, students can easily misinterpret assignments or fall behind on projects without realizing it. When they become aware, they may quickly attempt to replicate sample projects rather than engaging in deep-level learning. According to Ramsden (1997), insufficient time, lack of perceived support, lack of readiness, or previous rewards for “replicating” assignments contribute to the use of surface- rather than deep-level learning. Surface-level learning may go undetected by evaluation methods that measure only surface outcomes. While this is not solely a problem in online courses, the lack of face-to-face meetings reduces opportunities for detection and correction along the way. Another challenge to online evaluation is the increased opportunities for learner academic dishonesty. There are two prime concerns related to academic dishonesty, which seem to intensify for online courses: maintaining security of exams (to avoid cheating) and increased plagiarism (Le Heron, 2001). Exam security is easier to maintain in a traditional class with techniques such as using a separate answer sheet, so that actual test questions never leave instructor control. When learners are taking the exam remotely, questions can be copied and saved using screenshots, typed up in a separate document and saved, or in less restrictive systems, simply printed and passed around. Further complicating matters is the fact that instructors and learners do not gather in class to go over the exam. So, learners will need access to the questions in order to understand the feedback they receive. Yet, highquality exams are challenging to write, so creating multiple versions is time- and labor-intensive for the instructor. I find this to be particularly true with case study-based exams, which are my personal preference when assessing applied knowledge. Plagiarism also presents a problem. According to a recent Pew Research Poll of college presidents, 55% report an increase in plagiarism over the

Evaluation Considerations

119

past 10 years. Of those reporting an increase believe that computers and the Internet have played a major role (Pew Research Center, 2011). A recent posting to the Chronicle of Higher Education Technology blog reports several incidents of plagiarism in non-credit MOOC’s (Massive Open Online Courses), prompting surprise and an investigation by Coursera into how widespread the problem is (Young, August 2012). Other security challenges involve verifying that the learner is actually the one completing the learning activity, restricting student access to online materials or textbooks during exams, and ensuring learners aren’t engaging in unauthorized collaboration while completing exams. Below I share strategies I have found to be effective deterrents of academic dishonesty. Rowe (2004) also provides a number of countermeasures that might be used to help control exam security. Strategies for Discouraging Academic Dishonesty It is essential to clearly articulate “the rules” for learners in advance. Setting expectations, and helping learners to understand the rationale behind them, facilitates compliance behaviors. In a study of 121 undergraduate business students completing an 11-item cheating questionnaire, the percentage of respondents who reported cheating behaviors to be inappropriate went up significantly when the instructor provided test-taking policies. For instance, without a specific policy by the instructor, only 10% of respondents thought it somewhat or highly inappropriate to use an open book during an online exam. When the instructor stated a policy of no books, 71% of respondents thought it highly inappropriate (King, Guyette, & Piotrowski, 2009). The use of multiple channels in setting expectations helps to ensure that all learners get the message. The use of video can be particularly effective. The richer nature of the channel enhances perceptions of teacher immediacy, which has been shown to increase rule compliance. For instance, Rocca (2004) established a positive link between teacher immediacy and student attendance. Other related research showed that students are more likely to comply with requests from instructors who are high in teacher immediacy and less likely to comply with requests from those low in immediacy (Kearney, Plax, Smith, & Sorensen, 1988). In setting expectations for my own exams, I include the rules and reasons in a weekly video address to learners; I also include a written version of the rules in the instructions to the exam itself. I let students know that their behavior will influence the ability for future students to take the exam in a place of their convenience; the alternative would be requiring students to go
to a testing center. I include specific instructions about what files learners can and cannot save on their computers, and at what point all of the files should be deleted. I also include explicit directions about whom, if anyone, they may speak with about the exam. Learners are advised at the start of the exam that, in viewing the exam questions, they are indicating their acceptance of the terms. This seems to be effective. As previously discussed, making expectations explicit helps to reduce learner uncertainty and, at least in my experience, facilitates compliance. In developing online exams, it is difficult to prevent learners from using outside materials such as the Internet or textbooks. Rather than try to restrict their use, I have adopted a strategy of writing questions that require analysis, evaluation, and synthesis rather than just concept knowledge and comprehension (see Bloom’s taxonomy in Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956). This frees me to encourage learners to utilize outside resources as they demonstrate deeper-level understanding of concepts by applying them to cases. If this is not a possibility for your content, a testing center or exam proctor could provide a solution to security concerns. There are also commercial firms such as Kryterion and ProctorU that provide remote proctoring services via webcam. For traditional multiple-response exams, in addition to setting behavioral expectations, you might try question randomization. Narrowed release dates and shortened response times may help as well. Requiring learners to make an appointment to take the exam signals your awareness of the window of time in which they accessed it. Some Learning Management Systems (LMSs) will allow you to vary release times by student, making this process more manageable.
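To make the randomization idea concrete, here is a minimal Python sketch, offered purely as an illustration: most LMSs implement question pools and per-student draws internally, and the function and identifiers below are hypothetical rather than part of any particular system. Seeding the draw with the student and exam identifiers keeps each learner’s version stable if the exam must be regraded or reopened.

import hashlib
import random

def select_questions(pool, student_id, exam_id, n=20):
    # Seed a private generator from the student and exam IDs so each
    # student reliably gets the same (but individually different) draw.
    seed = int(hashlib.sha256(f"{exam_id}:{student_id}".encode()).hexdigest(), 16)
    rng = random.Random(seed)
    questions = rng.sample(pool, n)  # choose n items without replacement
    rng.shuffle(questions)           # also vary the question order
    return questions

pool = [f"Q{i:02d}" for i in range(1, 61)]  # a pool of 60 items, 20 served each
print(select_questions(pool, student_id="s1234", exam_id="midterm")[:5])

The larger the pool relative to the number of items served, the less any two students’ exams will overlap.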


Unfortunately, some learners are devoting more time to working around test controls than to studying for exams. One student reported a team approach to cheating using a shared Google Doc. According to The Chronicle of Higher Education technology blog, students took turns adding questions to a document that they all could access. With the two attempts the instructor allowed on each test, even randomization of questions and restricted release times were no match for the students’ system. All received an A in the class; apparently, none had earned it (Young, June 2012). Learner–instructor relationships, along with explicit expectation setting, seem to be essential to reducing tendencies toward cheating. Plagiarism is another common problem in both online and traditional face-to-face environments. Students are equally likely to wait too long to get started on assignments, misunderstand what constitutes plagiarism, or suffer from a poor work ethic, all of which are common causes of plagiarism. Strategies for addressing the root causes of plagiarism need to change in online courses. One approach to addressing this challenge is to require students to go through a tutorial on plagiarism, so they come to know what it is and why it is important to avoid it. This clarifies the issue and sets expectations. Weekly surveys designed to gather questions from learners about their projects help to ensure they are getting started and provide information about any misunderstandings. Over time, such surveys can also provide clues about needed revisions to project instructions and/or support materials. Another popular strategy is to employ plagiarism-detection software. Many LMSs have plagiarism-detection software built in; there are also stand-alone tools, one of the most commonly used in the United States being Turnitin.com.

Accessibility Considerations in Online Evaluation

One additional consideration related to evaluation in online courses is the need to build assessments so that students with disabilities can access them. Access needs may vary from extended time on exams to compatibility with adaptive modes of delivery. For instance, learners who are visually impaired can, with the aid of assistive technologies, have written exams read to them. At universities in the United States, it is common to have an office dedicated to providing learning accommodations for students. If your learners cannot come to campus, that office may be able to help in developing remotely administered exams that comply with accessibility standards. Moving courses online does present some unique opportunities and challenges for evaluation. In the next section we investigate how to integrate evaluation of student learning into course redesign.

INTEGRATING EVALUATION OF STUDENT LEARNING

Revisiting Learning Objectives

Begin the process of making choices related to evaluation by revisiting the learning objectives set during phase one of course redesign. We used a procedure for writing objectives that had strong implications for evaluation. The objectives were written to meet four criteria. They were to
be (1) observable, (2) measurable, (3) attainable, and (4) specific. We’ll briefly go over the aspects of the criteria most relevant to evaluation; for a full review of the process, revisit Chapter 3. Objectives generally imply summative evaluation; formative assessments should also be considered, as they can help learners succeed on summative tasks.

Strong Objectives are Observable

In Chapter 3, we introduced the need to articulate learning objectives so they feature student actions that can be observed, or so that the results of those actions are tangible. In other words, objectives should involve some behavior that students must perform in order to produce the desired results. For example, we might set the objective that learners will demonstrate the ability to assess the credibility of sources of evidence to generate a list of reliable sources for a (course-specific) research project. We can’t see our learners assessing the credibility of sources, but we can evaluate their skill in doing so based on the quality and number of entries on the reference list they create. This is how we make outcomes observable, even when we are not able to actually observe our students engaging in the behavior. As written, this objective meets only part of the criteria for strong objectives. So far, we’ve indicated the knowledge and skills that will need to be evaluated as part of student learning. As we strengthen the objective, the additional detail will have further implications for evaluation.

Strong Objectives are Measurable & Attainable

In order to estimate whether learners ultimately meet the objective, we must establish criteria for measuring their progress in advance. As discussed in Chapter 3, decisions about performance standards should be based upon who our learners are, what skills they already possess, and how much time they have to develop the needed skills. In our sample objective, we indicated that learners will gain skills in assessing the credibility of sources, but we did not indicate how many sources would be indicative of the desired level of achievement. To strengthen the objective, we would add those details, revising it this way: learners will demonstrate the ability to assess the credibility of sources of evidence by reviewing a list of instructor-provided resources and selecting the 10 most credible to be included in a list of reliable sources for a (course-specific) research project with 80% accuracy. Assessment of student learning would come from comparing their choices to an answer key. The closer the match between the learners’ reference lists and the key, the stronger their skills. Measurability of objectives ties directly to evaluation.
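As a small illustration of how such an objective translates directly into a scoring routine, the sketch below compares a learner’s selections to the instructor’s key and applies the 80% criterion. The source identifiers are invented for the example.

def score_reference_list(learner_choices, answer_key, threshold=0.8):
    # Accuracy is the share of the key that the learner's list matches.
    matches = set(learner_choices) & set(answer_key)
    accuracy = len(matches) / len(answer_key)
    return accuracy, accuracy >= threshold

key = {"src03", "src07", "src09", "src12", "src14",
       "src18", "src21", "src22", "src24", "src25"}
learner = ["src03", "src07", "src09", "src12", "src14",
           "src18", "src21", "src22", "src05", "src11"]

accuracy, met = score_reference_list(learner, key)
print(f"accuracy = {accuracy:.0%}, objective met: {met}")  # accuracy = 80%, objective met: True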


Strong Objectives are Specific

Objectives become more specific as we add information to make them observable, measurable, and attainable. Sometimes we need to add even more detail to ensure the objective depicts exactly what we want our learners to know or do, and at what level of difficulty. For instance, in the example above, learners will not be finding their own sources, but how difficult will it really be for them to spot the 10 most credible sources? We can bring that into focus by specifying how many sources will be provided for review and how long learners will have to complete their evaluations. Revised once again, the objective would read: learners will demonstrate the ability to assess the credibility of sources of evidence by analyzing a list of 25 instructor-provided resources using the American Library Association’s Guidelines of Information Literacy and identifying the 10 most credible to be included in a list of reliable sources for a (course-specific) research project within 48 hours with 80% accuracy. This objective is more specific and therefore more indicative of what students will actually be able to do. It also makes explicit what the summative evaluation will consist of. This more specific objective also provides clues about formative assessments that could be included in support of student learning. These will also be revealed in the instructional methods chosen as part of the design phase.

Revisit Instructional Methods

Later in phase one of course redesign, we used the objectives to inform our choices of instructional method through a set of supporting decisions. Those decisions centered on: information to be shared with learners, skills to be illustrated for them, means provided for them to practice the skills, and ways to ensure the learning was carried forward (called nurturing progress). Consult Chapter 3 for a richer discussion of Design decisions. We’ll briefly go over the aspects of these decisions that are most related to evaluation. Ultimately, both the instructional method and the means by which it is conducted will influence formative and summative evaluations of student learning.

Evaluation of Information Shared

As we previously discussed, students will need to be provided with some information about the knowledge and skills they are to build in order to achieve the learning objectives. Using our sample objective, we might use narrated slideware to describe the American Library Association’s Guidelines
of Information Literacy, then use a screencast to display an example of a website that meets those guidelines, highlighting how it meets the criteria. In order to move learners toward the learning objective, we could employ formative evaluations to determine whether students understand the information that was shared. In this case, we could use surveys that test learners’ comprehension of the guidelines. Consider strategies for discouraging academic dishonesty as you develop the surveys. The results of these assessments could be used to evaluate knowledge competencies that will be needed later in the summative assessment. Trend results from these evaluations could be used to determine whether the class is ready to move on or needs more instruction as a whole. Individual learners could be directed to additional content if they haven’t yet grasped the concepts.

Evaluation of Skills Illustrated

Learners will benefit from seeing how competencies should be applied to accomplish the learning objective. Competencies are the skills, abilities, and knowledge needed to perform a task (Jones, Voorhees, & Paulson, 2002). To achieve our objective, learners will need to see how the guidelines can be applied in evaluating the reliability of various types of sources. In Chapter 5, we identified Voicethread as a means to talk through good and bad examples; this tool also allows learners to ask questions within the thread. Given our learning objective, we might show how the guidelines help us identify reliable and unreliable sources. Formative assessment might take the form of instructor analysis of the questions learners pose afterward.

Evaluation of Guided Practice

Practice ensures that learners are actively learning how to apply the concepts and guidelines to achieve the learning objective. In the case of assessing the relative credibility of sources from a list, learners would need to conduct the kind of analysis previously demonstrated for them. This indicates that assessment of student learning should center on the performance of authentic tasks. In other words, learners should be able to demonstrate competency in evaluating the credibility of actual sources. One way to carry out evaluation of practiced skills is to assign a set of materials for learners to analyze, along with a list of the guidelines to apply. Formative evaluation might take the shape of self-review or peer review of work, guided by a rubric. Instructor feedback is particularly valuable when it comes to skills practice. Considerations related to the method and frequency of feedback should include the nature of the
assignment and the number of students in the class. We explore the issue of scalability in our discussion of additional evaluation methods later in the chapter.

Evaluation of Nurtured Progress

Ultimately, progress is nurtured through feedback on evaluations, so long as the evaluation is designed to aptly measure achievement of the learning objective. This step will entail the summative evaluation depicted in the learning objective, but it may entail other forms as well. For instance, learners may analyze the list of 25 resources, choosing the 10 most credible for their reference lists. Lists would be compared to the key; learners’ lists should match at least 80% of the entries on the key. Learners who can perform to these standards would have met the objective. However, for those who didn’t, there would be no information about what went wrong to nurture their future progress. Including an assessment activity in which learners explained their choices would enrich the information available to the instructor for the purposes of feedback.
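Picking up the earlier point about using trend results to decide whether the class is ready to move on, the sketch below aggregates formative quiz outcomes by concept and flags both concepts and learners needing attention. It is only a schematic illustration; the learner names, concept labels, and the 75% mastery cutoff are invented for the example.

from collections import defaultdict

def readiness_report(responses, mastery_cutoff=0.75):
    # responses maps each learner to {concept: True/False} quiz outcomes.
    totals, correct = defaultdict(int), defaultdict(int)
    struggling = defaultdict(list)
    for learner, outcomes in responses.items():
        for concept, ok in outcomes.items():
            totals[concept] += 1
            correct[concept] += ok
            if not ok:
                struggling[concept].append(learner)
    # Concepts below the cutoff suggest whole-class reteaching; the
    # struggling lists suggest who to direct to additional content.
    flagged = {c: correct[c] / totals[c] for c in totals
               if correct[c] / totals[c] < mastery_cutoff}
    return flagged, struggling

responses = {
    "ana":  {"currency": True,  "authority": False},
    "ben":  {"currency": True,  "authority": False},
    "cara": {"currency": False, "authority": True},
}
flagged, struggling = readiness_report(responses)
print(flagged)     # both concepts fall below the 0.75 cutoff here
print(struggling)  # e.g., 'authority': ['ana', 'ben'], 'currency': ['cara']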

Incorporate an Array of Evaluation Methods

As previously suggested, evaluation estimates student learning more accurately when it is multidimensional. In other words, employing a variety of assessment methods will better capture information about student learning and bring the data together for a more complete picture. Tests, discussions, real-world applications, student reflections, teams, peer review, and participation are all useful means of evaluation in online classes. Rubrics are essential to the process, as they communicate expectations, provide transparency, and aid overall student achievement. There are many more methods for estimating learner competencies than we’ll explore here. Angelo and Cross’s (1993) Classroom Assessment Techniques: A Handbook for College Teachers is still one of the best resources, with methods easily adaptable to online classes.

Consider Knowledge-Based Evaluation

Testing learner knowledge of concepts may be used as formative assessment, summative assessment, or both. We previously discussed ways to maintain security for online tests, such as randomizing questions or administering several versions. There are other factors in using tests that should be considered as well. Kelly and Haber (2006) recommend beginning with the purpose.
For instance, an estimate of concept knowledge can be made using linear questions and open-response questions (linear items being multiple choice, true/false, and matching; open-response items being fill-in-the-blank, short answer, and essay). The approach you take will depend upon other considerations, such as the length of time intended for the exam, the grading system, and the scoring mechanism. These considerations will all impact scalability. If time is limited or you have a large-enrollment course, you may want to limit open-response items or consider alternate scoring mechanisms. Jordan (2012) recently investigated how learners engaged with e-test questions and used the feedback provided. She used observational methods, having students answer questions and explain their approach. Factors like question wording and instructions (such as expected word length) influenced learner engagement, as did whether the assessment was formative or summative. Jordan also found that feedback was examined in more detail when learners were told that their responses were incorrect, and that the more detailed and tailored the feedback was, the more useful students found it. Linear and open-response tests can work as estimates of concept knowledge, but if you want to estimate learner skill in applying that knowledge, then performance-based assessments will be more effective (Kelly & Haber, 2006). Performance-based assessment might take the form of creating a product, such as the reference list in our sample objective, or an interactive simulation. We explore ideas for performance-based assessment below.

Consider Performance-Based Evaluation

Performance-based assessment of skills is more likely to be summative in nature. According to Kelly and Haber (2006), performance-based evaluation may be observational, mechanical, or an interactive simulation. Observational assessments involve watching the learner perform a skill; for instance, the learner may give a live presentation during an online class meeting. Mechanical assessments involve creation of a product, such as the reference list in our sample learning objective. Simulations involve learners operating within virtual contexts, demonstrating skills in responding to content-relevant circumstances. Performance-based assessments tend to be multilevel in nature. Generally, performance requires mastery of concept knowledge to support skill competencies. Portfolios are popular multidimensional assessment tools. They frequently include work samples (mechanical assessment), video demonstrations (observational assessment), and learner interpretations of portfolio contents (knowledge assessment). Case studies also lend
themselves to multilevel evaluation. They frequently ask learners to analyze a situation using a particular set of concepts (knowledge assessment) to generate a response, such as recommendations for resolving a problem (mechanical assessment), that they formally present for feedback (observational assessment). As discussed previously, assessment is not neutral; it is a judgment call. Instructors use their knowledge and values to define levels of performance. By making standards explicit, subjectivity is lessened. Rubrics are a good tool for crystallizing performance expectations and then communicating them to learners. Morgan and O’Reilly (2006) argue that creating a detailed breakdown of standards of achievement (a rubric), conveyed from the start, is essential for effective online assessment. Rubrics break an assignment into component parts and then describe what constitutes performance at various levels for each part (Stevens & Levi, 2005). “Parts” may refer to distinct tasks comprising the assignment, or to separate competencies that collectively result in performance. In the latter case, the individual competencies form separate dimensions upon which learning is to be evaluated. Rubrics are usually formatted as tables, with a row for each dimension and enough columns to accommodate the defined levels of performance, one level per column. Four or five levels of performance are common. Seeking to develop a rubric with strong inter-rater reliability, Newell, Dahm, and Newell (2002) found four levels to be most effective in the assessment of student portfolios. Four levels eliminated a neutral answer and forced evaluators to make a more positive or negative ranking; the authors also found it easier to make the performance-level descriptions more distinct. Rubric samples that set out the National Educational Technology Standards for Students also use four levels: initiating (attempting), approaching (working), meets (achieving), and exceeds (leading) (Kelly & Haber, 2006). Rubrics developed for my own classes use anywhere from 3–5 levels per dimension, depending on the complexity and relevance of the competency to overall performance; more complex tasks have more layers of performance. The key is to ascertain that your rubric captures distinctions in performance in such a way that you as the instructor understand them clearly and can articulate them meaningfully to your students. If you use common assignments, rubrics should be applicable consistently across classes and should represent the levels of performance in an agreed-upon fashion.
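To make the table structure concrete, here is a minimal sketch of a rubric represented as data, with hypothetical dimensions drawn from our running sample objective and the four-level labels cited above. Real rubrics live in documents or LMS tools, of course; the point is simply that dimensions, level descriptors, and weights are all a rubric needs.

LEVELS = ["initiating", "approaching", "meets", "exceeds"]

rubric = {
    "applies credibility guidelines": {
        "weight": 2,
        "descriptors": {
            "initiating":  "Names guidelines but does not apply them.",
            "approaching": "Applies some guidelines, inconsistently.",
            "meets":       "Applies all guidelines to each source.",
            "exceeds":     "Applies guidelines and weighs trade-offs.",
        },
    },
    "justifies selections": {
        "weight": 1,
        "descriptors": {
            "initiating":  "No rationale given.",
            "approaching": "Rationale given for some sources.",
            "meets":       "Clear rationale for every source.",
            "exceeds":     "Rationale cites specific guideline language.",
        },
    },
}

def score(ratings, rubric):
    # Convert per-dimension level ratings into a weighted 0-3 score.
    total_weight = sum(d["weight"] for d in rubric.values())
    points = sum(LEVELS.index(ratings[dim]) * d["weight"]
                 for dim, d in rubric.items())
    return points / total_weight

print(score({"applies credibility guidelines": "meets",
             "justifies selections": "exceeds"}, rubric))  # about 2.33 on a 0-3 scale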

128

REDESIGNING COURSES FOR ONLINE DELIVERY

Next we turn our attention to other forms of performance-based evaluation that provide formative data about influences on the learning process in addition to summative data about learning outcomes. Expectation setting will play a role in the quality of the information gathered, and student performance may benefit from the use of an effective scoring rubric.

Consider Additional Forms of Evaluation

In addition to the more purposeful evaluations we’ve been discussing, there is often a vast amount of assessment data relating to student performance available in the electronic archives of online courses. In this section, we look at three potential sources: threaded discussions, peer reviews, and information surveys. Each can be used to answer questions about how things are going; together, they hold clues about how things went. We begin by examining threaded discussions and the information they may provide, along with research-informed practices for their effective implementation.

Threaded discussions

Records of threaded discussion can reveal learner reactions to course content, signaling levels of comprehension of and engagement with the content. Records can also reveal the strength of the learning community through learner reactions to one another. Threaded discussions often take place via blog, wiki, or LMS-supported discussion board. Within instructor-guided discussions, learner responses are nested beneath a prompt posted by the instructor. Learners respond to the prompt or to other learners’ postings, often inspiring additional questions and comments. Instructionally, threaded discussions are an effective means to explore course concepts and their applications at a deeper level. Assignment and assessment of activities such as threaded discussion can help to facilitate learner engagement, but they need to be well designed and implemented. In a grounded theory study of learners enrolled in five online courses, learners reported that poorly structured discussion topics negatively affect their course experience and their response behaviors (Vonderwell, Liang, & Alderman, 2007). Surveys of students indicated that requiring learners to do three things helps make peer posts more useful to other learners. Respondents suggested that posts should: (1) make clear that learners have read the content, (2) respond directly to the post of at least one peer, and (3) add a question or new information to move the discussion forward. Ultimately, learners found rubrics reflecting those expectations to be conducive to performance and student learning (Vonderwell et al., 2007). In a study of 87 graduate students, Gilbert and Dabbagh (2005) found similar results: facilitator guidelines and evaluation criteria made for more meaningful student posts. However, setting requirements such as word counts and citation rules appeared to impede the quality and average number of posts submitted.


Discussions can be used as formative or summative evaluation. They are useful in assessing the level of engagement of individual learners, determining the strength of the learning community, and identifying any gaps in student learning that might be filled by additional instruction. They are an excellent means to answer questions about how things are going while the course is in progress; they may also hold clues about how things went, informing what should be done in the future.

Peer reviews

Peer reviews are another way to evaluate how things are going. We briefly touched on peer evaluation in our discussion of peer interaction and collaboration in Chapter 4. Peer accountability facilitates behaviors in support of shared goals; in the case of student teams, those goals should support learning. Utilizing a peer accountability process such as Rate Your Mate™ (“mate” refers to teammate) provides a wealth of data about how things are going (Parker & Coykendall, 2012). Through the Rate Your Mate process, teams identify shared goals and establish behavioral expectations. Later, a structured review process provides developmental and evaluative feedback to each teammate to enhance personal and group performance. Peer accountability facilitates collaboration, and the peer reviews supplied through such a process give the instructor a view into how things are going for each team, providing an opportunity to intervene when teams need additional support. Peer review can also take the form of learners providing feedback to peers on the quality of their assignments. In developing a typology of peer assessment, Topping (1998) defines it as “an arrangement in which individuals consider the amount, level, value, worth, quality, or success of the products or outcomes of learning of peers of similar status” (p. 250). This form of peer review would likely be used in online classes when peers are cast as information resources rather than collaborators (see Chapter 4 for a full discussion). It will likely improve the learning of both the recipient and the reviewer, especially if both are trained in the process. In analyzing the perceived helpfulness of comments provided by peer learners versus experts (i.e., instructors), Cho, Schunn, and Charney (2006) found that directive comments (those providing a specific critique along with a suggestion for improvement) and praise comments were most helpful to learners. Experts provided far more directive comments than any other type, and far more than peers did; peers, on the other hand, provided more praise comments than did experts. Praise comments appeared to have motivational effects, making it more likely that learners would revise multiple times.


Peer assessment of assignments guides practice and aids student learning. It also provides information useful to instructors in evaluating how things are going. Reviewing instructor and peer evaluations can provide clues about skill gaps and assignment misconceptions while there is still time to address them in class.

Information surveys

One other assessment technique that can help answer questions about student learning as it happens is the information survey. Surveys can take many forms, but one that is particularly useful in answering questions about how things are going is Brookfield’s (1995) Critical Incident Questionnaire (CIQ). The CIQ asks learners five questions designed to get at what’s working, what’s not, and why or how. For example, one of the questions asks, “At what moment this week did you feel most engaged?” Another asks the inverse, “When did you feel distracted?” Another question asks, “What action of others was helpful?” and another asks the same question, substituting the word puzzling for helpful. Using Brookfield’s CIQ, I find I get valuable information about what I may be doing that learners find helpful or puzzling. Administering surveys such as the CIQ tends to set expectations among learners that something will be done to address the puzzling or disengaging elements, so I only use it when I intend to act; use the CIQ in moderation. It is most useful as formative assessment, but may be combined with information from other evaluation techniques to answer questions about how well things went, providing an overall picture of student learning.

Further Considerations on Evaluating Student Learning

In assessing student learning, we looked to answer questions related to how things should go, how things are going, and how things went. We looked to our learning objectives and were guided by the predetermined outcomes. We also looked to our instructional methods; methods will strongly influence evaluation choices, as many have evaluation already built in. Interaction decisions will also hold sway: if you’re using peers as information resources or collaborators, you will want to consider peer evaluation strategies. Previous media selection decisions may also suggest evaluation methods, and the methods of evaluation we choose may dictate additional media choices. For instance, if we opt to give online knowledge tests, we may need a tool with which to deliver the questions and feedback. If you use an LMS, it may have a test tool built in. You might also consider independent testing tools such as QuizStar, ClassMarker, Hot Potatoes, or an interactive form in Google Docs. If you decide to use peer review of individual assignments, you might consider a tool that will randomly assign the work to reviewers, using a double-blind system to reduce bias.
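The core of such a tool is easy to picture. The sketch below, a simplified illustration rather than how any particular product works, shuffles the roster and assigns each submission to the next few learners in the rotation, which guarantees that no one reviews their own work; anonymity then only requires showing reviewers submission IDs instead of names.

import random

def assign_reviewers(students, reviews_per_paper=2, seed=None):
    assert 0 < reviews_per_paper < len(students)
    rng = random.Random(seed)
    roster = students[:]
    rng.shuffle(roster)
    n = len(roster)
    # Rotate through the shuffled roster: each author is reviewed by the
    # next reviews_per_paper people, so no one is assigned their own work.
    return {author: [roster[(i + k) % n] for k in range(1, reviews_per_paper + 1)]
            for i, author in enumerate(roster)}

print(assign_reviewers(["ana", "ben", "cara", "dev"], seed=7))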


SWoRD peer review by Panther Learning, Turnitin.com, and publisher-provided tools such as InSite from Cengage Learning all include a double-blind review feature. Summative evaluation choices may also be enabled by technology. For instance, should you choose to assign a project such as a portfolio, learners can create and store portfolios relatively easily in an online tool like Mahara; if learners want more freedom in layout and look, WordPress might be a better choice. Finally, in choosing evaluation methods and setting performance criteria, the validity and reliability of tests and rubrics need to be established, to ensure that the information gathered is truly indicative of the intended student learning and that the measures capture it consistently. Consider using pilot tests of surveys, field tests of exam questions, and expert reviews of rubrics to move toward the goal of using valid, reliable measures. Analysis of learner data can provide additional information about the overall effectiveness of evaluation methods (see Moskal & Leydens, 2000, for a more thorough examination of this concern). In the end, instructors will use this evaluation data to estimate student learning, but it can also be used in evaluating other aspects of the redesigned course. Next, we investigate how the data might be used in conjunction with additional data collection strategies to provide information about teacher effectiveness and overall course quality.

EVALUATING INSTRUCTOR EFFECTIVENESS AND COURSE QUALITY

In this section we bring together instructor effectiveness and course quality. Instructor effectiveness focuses on assessing instructor skill in moderating and facilitating online courses. Course quality examines the overall learning experience and how well it brings about the desired change in learners while leaving them with a general sense of satisfaction. These are separate evaluation questions, but they are often answered using shared methods of data collection and, given our focus on course redesign, we are most interested in collecting data that is usable by instructors as they teach and revise courses for future semesters. Instructor effectiveness is grounded in interaction; decisions made during phase two of course redesign will influence instructor evaluation. Course quality is a global rating that represents how well the four sets of
redesign decisions (DIME) come together to create a positive learning experience for students. Judgments about quality may vary by perspective; students, instructors, and institutions don’t always have the same agenda, so a multifaceted approach is essential to making a reasonable estimate of course quality (Sherry, 2003). Learner reviews will provide valuable information about how satisfied learners were with the elements of the course, but they can be unreliable estimates of what worked. Research has established a correlation between grades earned in courses and student evaluations of teaching (SETs). Unfortunately, this does not necessarily mean that students have learned more. Johnson (2003), in an extensive review of the issue, found support for both the leniency and reciprocity hypotheses. Leniency indicates that instructors who were more lenient graders were rated higher on SETs; reciprocity was the granting of higher SET scores to instructors by students who received higher course grades (Clayson, 2004, 2008). Also of potential concern is the finding of a negative association between SETs and course rigor. Course rigor includes student perceptions of workload, difficulty of material, and the amount of time needed to complete assignments. Centra (2003) found student ratings were lower for more rigorous courses. Results such as these strongly suggest the need to use SETs as formative rather than summative evaluation in determining instructor effectiveness.
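If you want a quick check of how this pattern looks in your own sections, the correlation is simple to compute. The sketch below uses Python’s statistics.correlation (available in Python 3.10 and later) on invented section-level numbers; a high coefficient by itself does not establish leniency or reciprocity, it only flags an association worth thinking about.

from statistics import correlation  # Python 3.10+

# Hypothetical per-section data: mean course grade (0-4 scale) and
# mean SET rating (1-5 scale) for five past sections of one course.
mean_grades = [2.9, 3.1, 3.4, 3.6, 3.7]
mean_sets = [3.8, 4.0, 4.1, 4.4, 4.6]

r = correlation(mean_grades, mean_sets)
print(f"grade-SET correlation: r = {r:.2f}")  # r = 0.96 for these numbers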


There is value in assessing student satisfaction, but it differs in character from assessment of student learning. Learners may be dissatisfied for reasons other than their acquisition of new knowledge and skill; what learners need and what they like in online courses may not be aligned. In a meta-analysis of 59 published and 96 unpublished studies examining moderators of the effectiveness of web-based instruction, Sitzmann, Kraiger, Stewart, and Wisher (2006) found many contradictions between the conditions learners preferred and the conditions that actually produced better learning outcomes. For example, learners learned more, relative to classroom instruction, in courses that required high activity on their part, gave them high control, and had them engage in low human interaction; but learners were more satisfied when their required levels of activity were low, they had low control, and levels of human interaction were high. Learners were also more satisfied with courses that were shorter in duration, but they learned more from longer courses. Examination of both learning and satisfaction is needed. Satisfaction measures can help identify aspects of the course that might demotivate learners, lessening their effort and potentially contributing to attrition (Levy, 2007). In assessing instructor effectiveness and course quality, data from learner satisfaction surveys and performance measures are important, but additional information will make assessments more robust in answering questions about how things are going and how things went. In the sections that follow, we consider three additional means of assessing instructor effectiveness and course quality: evidence-based instructor self-reviews, peer observations, and external reviews.

Evidence-Based Instructor Self-Reviews

Action research has long been used in analyzing and improving classroom instruction. It differs from formal research in that it is grounded in the local context: rather than testing hypotheses and generating findings that are generalizable across a larger population, action research seeks to answer questions specific to the instructors and students in particular courses. It does so through a systematic process that often utilizes mixed methods of data collection and analysis (Stringer, 2007). Action research is an evidence-based process that is often used in program assessment as well as by individual instructors. According to Blumberg (2011), it is critical to use evidence in order to teach effectively. She argues that evidence needs to be both internal and external to the course: internal evidence includes data about student learning, while external evidence involves the integration of pedagogy and research-generated best practices into teaching methods. In her book Teaching that Promotes Better Learning, Blumberg (forthcoming) suggests instructors collect information about their teaching from their own self-reflections. These can be added to the results of formative assessments given to students, the analysis of notes and feedback provided to students, and learner performance data. Collectively, these sources of data can be used to answer questions about how things went and how things should go next time. Instructor self-reviews can provide valuable insights about how well particular aspects of our courses are going. Those aspects will be determined by our approach to self-reflection; in other words, our self-reviews will answer the questions we see as important. As such, they will not be neutral. While that does not negate their value, it does suggest a need for triangulation with more objectively derived data to get a more complete picture of what’s going on. One possibility is to incorporate the use of peer observations.


Peer Observations

Peer observations are well suited to assessing teacher effectiveness and overall course quality. Without synchronous sessions to attend, peer review will likely consist of examining course artifacts such as content modules, asynchronous discussions, and learner-generated materials to estimate how things are working. In looking at instructor effectiveness, peer observations enable the examination of learner–instructor interactions from a nonparticipant point of view. In Chapter 4 we discussed three types of learner–instructor interactions: administrative, facilitative, and relational. Measures of teacher presence are being developed that incorporate behaviors such as directing instruction and facilitating discourse. As a reminder, teacher presence refers to the sense of closeness learners share with the instructor (Garrison, 2013). The behaviors incorporated in such measures capture the essence of learner–instructor interaction. For instance, Anderson, Rourke, Garrison, and Archer (2001) developed a tool for assessing teacher presence, suggesting it be applied to discussion transcripts to look for signs of efficacy in instruction and facilitation. Evidence of direct instruction would be behaviors such as providing feedback, injecting new knowledge into the discussion, and responding to technical problems; evidence of facilitation would be behaviors related to setting the climate, drawing participants into the discussion, and consensus seeking. The inter-rater reliability for the measure ran between .77 and .84, with the stronger reliability associated with the manifest behaviors of direct instruction. There is still work to be done in establishing the validity of the measures, but peer observers could use them to provide summative feedback about instructor effectiveness as well as formative feedback about course quality. Another approach is to examine learners’ expressed social presence. Social presence is a sense of psychological closeness with peers (Short, Williams, & Christie, 1976). The results of such an examination should reflect teaching effectiveness, in that learner interactions with course content and peers are facilitated through instructor behaviors. Rourke, Anderson, Garrison, and Archer (2001) developed a measure for assessing learner social presence that looks at learner actions such as quoting from a peer’s post, continuing a discussion thread, addressing fellow learners by name, complimenting peers, self-disclosure, and use of humor. The measure features strong inter-rater reliability (.91–.95). Sunal, Sunal, Odell, and Sundberg (2003) bring together indicators of teaching presence, social presence, and course design in their Checklist for
Online Interactive Learning (COIL), which measures outcomes of online facilitation. This differs from both the Anderson et al. (2001) and Rourke et al. (2001) measures in that it looks at instructor and learner behaviors within the same discussion transcripts, capturing the essence of interaction. One potential limitation in choosing a more comprehensive measure such as COIL (Sunal et al., 2003) is the presumption that all three types of learner–instructor interaction will be present. Which interactions are privileged will be a function of the content, learners, and instructor, as depicted by the metaphor; all types of interaction need not be present to result in a positive learning experience. The results of the Sitzmann et al. (2006) meta-analysis demonstrate this: lower-interactivity courses were found to result in better learning, but lower learner satisfaction. Regardless of the processes peer observers choose to employ, instructors will want to inform them about their redesign choices. This may help observers choose the most apt methods, enabling feedback about both the effectiveness of the redesign choices and their utility.
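When two peer observers code the same transcripts with one of these instruments, agreement can also be checked locally. Cohen’s kappa is one common agreement statistic (the studies above report inter-rater reliability, though not necessarily computed this way); the coding labels and data below are invented for illustration.

from collections import Counter

def cohens_kappa(coder_a, coder_b):
    # Agreement between two coders labeling the same units, corrected
    # for the agreement expected by chance.
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["direct", "facilitate", "direct", "other", "facilitate", "direct"]
b = ["direct", "facilitate", "other", "other", "facilitate", "direct"]
print(round(cohens_kappa(a, b), 2))  # 0.75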

External Reviews

There is a growing interest in developing quality standards for online courses in general. Quality Matters™ is a rubric-based, peer-review system that more than 700 institutions of higher learning are using to evaluate online courses as part of a continuous improvement process. The rubric provides weighted criteria related to learning objectives, instructional materials, learner interaction and engagement, course technology, assessment, and accessibility. Although there is concern that an externally generated review process may impinge on academic freedom, the rubric is written in such a way that it accommodates the host of choices we’ve made during the course redesign process. For instance, rather than prescribing what learner interaction and engagement should look like, as did the measure developed by Rourke et al. (2001), the Quality Matters rubric is written so it can be applied to evaluate the type of interaction intended by the course design.1 The following are the rubric items for learner interaction and engagement:

• The learning activities promote the achievement of the stated learning objectives.
• Learning activities provide opportunities for interaction that support active learning.
• The instructor’s plan for classroom response time and feedback on assignments is clearly stated.
• The requirements for student interaction are clearly articulated.

Items for the other standards of quality are similarly phrased. By writing them in this way, the appropriateness of the redesign choices for the content and learners can be assessed along with the means of implementation. In answering questions about how things went, you may want to triangulate the results of external reviews with those of peer observers when they examine actual course discussions. Add in instructor self-reviews, and the result is a rich data set upon which to evaluate instructor effectiveness and course quality. There are costs associated with a subscription service such as Quality Matters. If those costs preclude your access to such tools, there are other means of external review. Teaching and course portfolios are options. The University of Michigan has assembled a variety of resources for creating portfolios.2 One resource that may be particularly useful is the national repository of course portfolios.3 Much of the information gathered during instructor self-reviews and peer evaluations could be included in a course portfolio and then sent out for external review. The state of Ohio has created a course review bartering system to help manage the costs of external reviews. While such a system may not exist near you, you might consider asking associates from your professional organizations to review your portfolio.

Student Reviews

Now that we’ve reviewed a number of evaluation options, we return to learner-supplied reviews. How learners perceive the instructor and course may ultimately speak to satisfaction, but student reviews can also provide actionable information along the way. University-administered end-of-course evaluation systems (SETs) provide summative responses to how things went, at least in theory. But technology can enable anonymous student input during the semester, providing information to instructors while there is still time to make changes. Some tools allow you
to adapt the reviews to ask about the things you see as most important to your content and learners. One such tool is the Student Assessment of Learning Gains (SALG), an online instrument that asks learners about the degree to which a course enabled their learning. There are SALG-generated instruments, or the instructor can create a baseline measure to gather information about learner competencies pre- and post-course; the results are then aggregated for the class. The instructor can see who has completed the instrument, but the responses themselves are anonymous, protecting learners and encouraging honesty. One additional resource for learner reviews of teaching, which can be administered at the discretion of the instructor, is the survey system called TooFast.4 Results are delivered in the aggregate and enable discussions and change processes during the course.

Last Thoughts on Instructor and Course Evaluations

Assessing teacher effectiveness and course quality should be as multifaceted as the evaluation of student learning. Few instructors would implement a single measure of student achievement within a course; instead, multiple tests, projects, and homework assignments are administered to paint a picture of learning. So it should be for teacher effectiveness and course quality. The more plentiful and varied the data, the more accurate the estimate of how things worked and the more reliable the information upon which to make future decisions.

CONCLUSION

The final phase of the course redesign process seeks to answer questions about how things should go, how things are going, and how things went. Formative evaluation gathers information about learners, content, and the instructor prior to the start of the course to inform how things should go. Formative assessments can also be used during the course, answering questions about how things are going, which can inform changes needed along the way. While formative assessment is important to the instructional process in general, it becomes more essential in online contexts, where vicarious cues are unavailable; everything needs to be more explicit. Summative evaluations answer questions about how things went. They may be used at the close of a learning activity or as part of determining learner achievement of the learning objectives. Summative evaluations from one activity may provide formative data to the next.


Data about student learning may be used as part of formative or summative evaluations of instructor effectiveness and course quality. Methods such as SETs, instructor self-reflections, peer observations, and external reviews can also be used to estimate how things are going and how things went. Evaluation is a subjective, value-laden process; those designing it determine what is worthy of measurement and what is not. This does not mean the process is without rigor. A robust evaluation process, planned in advance, made transparent to learners and other stakeholders, and employing valid and reliable measures, can yield valuable information about student learning, instructor effectiveness, and course quality. In higher education, we measure for improvement (see Astin & Antonio, 2012, for a more complete discussion). Redesign considerations related to evaluation are intertwined with the other phases of course redesign. As previously discussed, we separate the considerations into the categories of DIME, but there is overlap; decisions made during each redesign phase may affect previous choices and influence those made in future phases. Course redesign is an iterative process. Evaluation is the phase that solidifies our course plan and sets the standards by which we’ll determine our success in bringing about desired changes in our learners, our course, and ourselves. In the next chapter we revisit the highlights of using the DIME model for course redesign, examine additional resources, and suggest future directions.

NOTES

1. Available from www.qualitymatters.org
2. Available from http://www.crlt.umich.edu/tstrategies/tstpcp
3. Available from http://www.courseportfolio.org/peer/pages/index.jsp?what=searchD
4. Available from https://www.toofast.ca

REFERENCES

Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5(2), 1–16.

Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco, CA: Jossey-Bass.


Astin, A. W., & Antonio, A. L. (2012). Assessment for excellence: The philosophy and practice of assessment and evaluation in higher education (2nd ed.). Washington, DC: Rowman & Littlefield.

Bartley, J. (2006). Assessment is as assessment does: A conceptual framework for understanding online assessment and measurement. In M. Hricko & S. L. Howell (Eds.), Online assessment and measurement: Foundations and challenges (pp. 1–45). Hershey, PA: Idea Group.

Biggs, J. B. (1985). The role of meta-learning in study processes. The British Journal of Educational Psychology, 55, 185–212.

Biggs, J. B., Kember, D., & Leung, D. Y. P. (2001). The revised two factor study process questionnaire: R-SPQ-2F. British Journal of Educational Psychology, 71, 133–149.

Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York, NY: Longman-Green.

Blumberg, P. (2011). Making evidence-based practice an essential aspect of teaching. Journal of Faculty Development, 25(3), 27–32.

Blumberg, P. (forthcoming). Teaching that promotes better learning. San Francisco, CA: Jossey-Bass.

Brookfield, S. D. (1995). Becoming a critically reflective teacher. San Francisco, CA: Jossey-Bass.

Centra, J. A. (2003). Will teachers receive higher student evaluations by giving higher grades and less course work? Research in Higher Education, 44, 495–518.

Cho, K., Schunn, C. D., & Charney, D. (2006). Commenting on writing: Typology and perceived helpfulness of comments from novice peer reviewers and subject matter experts. Written Communication, 23(3), 260–294.

Clayson, D. E. (2004). A test of the reciprocity effect in the student evaluation of instructors in marketing classes. Marketing Education Review, 14(2), 11–21.

Clayson, D. E. (2008). Student evaluations of teaching: Are they related to what students learn? A meta-analysis and review of the literature. Journal of Marketing Education, 31(1), 16–30.

Garrison, D. R. (2013). Theoretical foundations and epistemological insights of the community of inquiry. In Z. Akyol & D. R. Garrison (Eds.), Educational communities of inquiry: Theoretical framework, research, and practice (pp. 1–11). Hershey, PA: IGI Global.

Gilbert, P. K., & Dabbagh, N. (2005). How to structure online discussions for meaningful discourse: A case study. British Journal of Educational Technology, 36(1), 5–18.

Horspool, A., & Lange, C. (2012). Applying the scholarship of teaching and learning: Student perceptions, behaviors and success online and face-to-face. Assessment and Evaluation in Higher Education, 37(1), 73–88.

Johnson, V. E. (2003). Grade inflation: A crisis in college education. New York, NY: Springer.

Jones, E. A., Voorhees, R. A., & Paulson, K. (2002). Defining and assessing learning: Exploring competency-based initiatives. NCES 2002-159. Washington, DC: Council of the National Postsecondary Education Cooperative Working Group on Competency-Based Initiatives.

Jordan, S. (2012). Student engagement with assessment and feedback: Some lessons from short-answer free-text e-assessment questions. Computers & Education, 58, 818–834.

Kearney, P., Plax, T. G., Smith, V. R., & Sorensen, G. (1988). Effects of teacher immediacy and strategy type on college student resistance to on-task demands. Communication Education, 37(1), 54–67.


Kelly, M. G., & Haber, J. (2006). National educational technology standards for students: Resources for student assessment. Eugene, OR: International Society for Technology in Education.

King, C. G., Guyette, R. W., & Piotrowski, C. (2009). Online exams and cheating: An empirical analysis of business students’ views. The Journal of Educators Online, 6(1). Retrieved from http://www.thejeo.com/Archives/Volume6Number1/Kingetalpaper.pdf. Accessed on April 8, 2013.

Le Heron, J. (2001). Plagiarism, learning dishonesty or just plain cheating: The context and countermeasures in information systems teaching. Australian Journal of Educational Technology, 17(3), 244–264.

Levy, Y. (2007). Comparing dropouts and persistence in e-learning courses. Computers & Education, 48(2), 185–204.

Marton, F., & Saljo, R. (1976). On qualitative differences in learning. The British Journal of Educational Psychology, 46, 4–11.

Morgan, C., & O’Reilly, M. (2006). Ten key qualities of assessment online. In M. Hricko & S. Howell (Eds.), Online assessment and measurement: Foundations and challenges (pp. 86–101). Hershey, PA: Idea Group.

Moskal, B. M., & Leydens, J. A. (2000). Scoring rubric development: Validity and reliability. Practical Assessment, Research & Evaluation, 7(10). Retrieved from http://PAREonline.net/getvn.asp?v=7&n=10

National Research Council. (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academies.

Newell, J. A., Dahm, K. D., & Newell, H. L. (2002). Rubric development and inter-rater reliability issues in assessing learning outcomes. In Proceedings of the 2002 American Society for Engineering Education Annual Conference & Exposition. Retrieved from http://search.asee.org/search/fetch;jsessionid=gmm6tgdthkgtl?url=file%3A%2F%2Flocalhost%2FE%3A%2Fsearch%2Fconference%2F26%2FAC%25202002Paper904.pdf&index=conference_paper&space=129746797203605791716676178&type=application%2Fpdf%charset=. Accessed on July 6, 2013.

Parker, R. E., & Coykendall, S. (2012, November). The Rate Your Mate process: Facilitating collaboration through goal setting, shared expectations, and peer accountability. Paper presented at the annual meeting of the National Communication Association, Orlando, FL.

Pew Research Center. (2011, August 28). The digital revolution and higher education. Pew Internet & American Life Project. Retrieved from http://www.pewinternet.org/Press-Releases/2011/The-Digital-Revolution-and-Higher-Education.aspx. Accessed on June 20, 2013.

Ramsden, P. (1997). The context of learning in academic departments. In The experience of learning (2nd ed.). Edinburgh: Scottish Academic Press.

Ramsden, P. (2005). The context of learning in academic departments. In F. Marton, D. Hounsell, & N. Entwistle (Eds.), The experience of learning: Implications for teaching and studying in higher education (3rd ed., pp. 198–216). Edinburgh: University of Edinburgh.

Rocca, K. (2004). College student attendance: Impact of instructor immediacy and verbal aggression. Communication Education, 53(2), 185–195.


Rourke, L., Anderson, T., Garrison, D. R., & Archer, W. (2001). Assessing social presence in asynchronous, text-based, computer conferencing. Journal of Distance Education, 14(3), 51–70.

Rowe, N. C. (2004). Cheating in online student assessment: Beyond plagiarism. Online Journal of Distance Learning Administration, 7(2). Retrieved from http://www.westga.edu/~distance/ojdla/summer72/rowe72.html

Sherry, A. C. (2003). Quality and its measurement in distance education. In M. G. Moore & W. G. Anderson (Eds.), Handbook of distance education (pp. 435–460). Mahwah, NJ: Lawrence Erlbaum.

Short, J., Williams, E., & Christie, B. (1976). The social psychology of telecommunications. Toronto, ON: Wiley.

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology, 59, 623–664.

Stevens, D. D., & Levi, A. (2005). Introduction to rubrics: An assessment tool to save grading time, convey effective feedback, and promote student learning. Sterling, VA: Stylus.

Stringer, E. T. (2007). Action research. Thousand Oaks, CA: Sage.

Sunal, D. W., Sunal, C. S., Odell, M. R., & Sundberg, C. A. (2003). Research supported best practices for developing online learning. The Journal of Interactive Online Learning, 2(1). Retrieved from http://www.ncolr.org/jiol/issues/pdf/2.1.1.pdf

Topping, K. (1998). Peer assessment between students in colleges and universities. Review of Educational Research, 68(3), 249–276.

Vonderwell, S., Liang, X., & Alderman, K. (2007). Asynchronous discussions and assessment in online learning. Journal of Research on Technology in Education, 39(3), 309–328.

Young, J. R. (2012, August). Dozens of plagiarism incidents are reported in Coursera’s free online courses. The Chronicle of Higher Education. Retrieved from http://chronicle.com/article/Dozens-of-Plagiarism-Incidents/133697/. Accessed on July 24, 2013.

Young, J. R. (2012, June). Online classes see cheating go high-tech. The Chronicle of Higher Education. Retrieved from http://chronicle.com/article/Cheating-Goes-High-Tech/132093. Accessed on July 24, 2013.

CHAPTER 7

REPRISE AND FURTHER CONSIDERATIONS

ABSTRACT

Course redesign is a creative process that involves the four sets of considerations set out by the DIME model. In this chapter, we highlight key considerations related to design, interaction, media, and evaluation and describe the interconnections among decisions within the model that make the process iterative. In addition, we suggest supplementary matters for your consideration, specifically issues related to career and course management. Career considerations are strategic-level concerns related to course redesign that have potentially long-term implications. Course management considerations are tactical-level suggestions aimed at making your course implementation a success. Issues and suggestions are grounded in experience.

Keywords: Course redesign; online education; online course management; instructional design; intellectual property; promotion and tenure

INTRODUCTION

If you have been redesigning your course as we moved through the considerations, it's time to implement it. If you waited to get started, the time to begin is now! Course redesign is a creative and fun process that involves the four sets of considerations set out by the DIME model: design, interaction, media, and evaluation. We introduced the concept of an overarching course metaphor to focus redesign decisions on the particular needs of your content, your learners, and your instructional style. This contextual relativist approach ensures course redesign decisions fit your particular situation.1


In this final chapter, we review the four phases of course redesign, highlighting key considerations and suggesting supplementary matters to weigh as you prepare to implement your course. In addition, we explore future directions and other redesign-related matters for your contemplation. We begin where we started our journey together, with a review of our redesign goals and the illuminating powers of metaphor.

METAPHOR REVISITED

We came to the redesign process with a specific goal in mind: to create the kind of course we wanted for our students, one that would result in deep-level learning and a satisfying experience for learners and for ourselves. We looked at the DIME model as a roadmap that would help get us to our destination. We had some fun along the way, much of it from playing with the revealing nature of metaphor.

Our course metaphor has been a powerful lens through which to view the course redesign process. At the start of the process, we experimented with a variety of metaphors to see what each revealed about instructional philosophy as applied to our content and our learners. We introduced the playground metaphor, which we used as an example throughout the phases of redesign. The playground presents learning in independent chunks, much like equipment on a playground. The instructor monitors learner activity. Learners will likely come to the course at different levels of preparation and engage with content as they are able. We contrasted the view of learners and content as seen through the playground metaphor with those reflected in metaphors such as a marketplace and a safari. Through the lens of a marketplace, content would need to be combined from across modules to create projects, much like food from the various stands would be brought together to make a meal. The instructor would play a much more hands-on role in facilitating connections. Through the lens of the safari metaphor, content would be developed to take learners on a highly focused journey, such as making learners better managers. The instructor would play the role of guide.

The metaphors we choose often reflect our intuitions about our content, our learners, and our role as instructors, and how the three fit together. By revealing our intuitions, metaphor moves our choices from our subconscious to our conscious mind, enabling reflective decisions that capitalize on instincts and check our perceptions.


Throughout the redesign process, we filter all of our choices through our selected metaphor to create a unified vision for our course. Remember the metaphor of The Little Inn at the Crossroads introduced in Chapter 2? Bento (2000) used the inn to organize content about "leading with soul," presuming this would be a spiritual journey for her learners. She presumed they would need space for self-reflection, discussion, and a library of materials to consult for inspiration. She saw herself as the innkeeper, the provider of the nurturing essentials, but not a principal player in the journey itself. Without the metaphor, it's unlikely she would have had so clear a vision of how to create this course experience.

Metaphors need not be forward facing. Learners need not know what mental models you are employing. Metaphors are primarily redesign tools and won't necessarily be useful to your learners. Such was the case in one of my courses utilizing a baseball metaphor. It helped me to conceptualize content in terms of good, better, and best outcomes and to think of myself as a coach, but in the end, my learners saw the course as an office simulation.

Metaphor is the big-picture view that drives redesign decisions, making them easier. As we've seen throughout the four phases of course redesign, metaphor helps to focus our attention on the considerations most relevant to our particular situation. Without it, we might find the number of choices to consider overwhelming and the results chaotic. Metaphor helps bring order to the process by providing a unifying framework for making decisions within each of the four sets of considerations set out by the DIME model.

HIGHLIGHTS OF THE DIME REDESIGN MODEL: A REPRISE

The course redesign model sets out considerations in four phases, beginning with those related to basic instructional design. This is the phase during which we decide what our students will learn. This set of foundational decisions is built upon in the second phase of course redesign: during the interaction phase, we decide how our students will learn. The media phase follows, where we choose the tools to accomplish the learning. Finally, in the evaluation phase, we figure out how we'll know that the learning has actually occurred. Throughout our examination of the phases of course redesign, we outlined considerations related to our content, our learners, and our instructional style.


Next, we highlight some of those considerations and how they interconnect. Later, we suggest additional matters to consider in implementing your redesigned course. We begin by revisiting decisions related to instructional design.

Design Considerations

Online teaching and learning are well suited to a "learn by doing" approach. As previously discussed, passive methods that may work in the classroom don't translate well to online learning. Taped lectures aren't engaging enough; they fade into the background as learners attempt to multitask, something impossible to do well unless one of the tasks is automatic, like driving a car or tying your shoes (see Chapter 4). To make learning active, we create objectives that are behavioral; in the end, we see our learners acting out the skills or have them submit work that requires those skills as evidence. For instance, if we want learners to "know" the axioms of a theory, we might set an objective that has learners labeling the parts of a scenario where the axioms are visible.

Setting Strong Course Objectives
Making objectives observable is one of four criteria for strong learning objectives. The others are to make objectives measurable, attainable, and specific. Making objectives measurable involves determining in advance how we will measure the attainment of the objective. In so doing, we foreshadow evaluation decisions to be crystallized during phase four of course redesign. Making objectives attainable and specific involves considerations related to your content and your target audience (your learners). In Chapter 3, we explored factors that influence objective setting, such as the demographics, personalities, learning styles, past performances, and expectations of our target audience. We revisited metaphor to examine the views held about our audience reflected there. For example, are learners ready to play, weave, dig, or collaborate with the content?

Choosing Instructional Methods
Learning objectives point to instructional methods by suggesting the information that will need to be shared with learners, the skills to be illustrated, ways for learners to practice those skills, and means by which to nurture the learning progress.


During this first phase, we make strategic-level decisions about instructional methods that interconnect with considerations related to interaction, media, and evaluation to become specific enough to implement. We underscore this process in our review of interaction.

Interaction Considerations

There are three types of learner-centered interactions to consider in online teaching and learning: learner–content interactions, learner–instructor interactions, and learner–learner interactions. Interaction with content is what makes the course a learning experience (Moore, 1989). Interaction with the instructor is what makes it a teaching experience. Not all three types of interaction need be present in equal measure; some types of interaction will be privileged over others. Decisions about interaction give the course its unique character and most influence the way learners and instructors experience the course.

Learner–Content Interactions
Content interactions relate primarily to the layout and structure of the course. They relate most closely to the instructional methods of sharing information, illustrating skills, and guiding practice. Learners can engage these methods in isolation, creating an independent learning experience. This works best for content in which learners are highly interested, tapping their intrinsic motivation to move through the course. Without such interest, learners are far more likely to engage content at a surface level, employing strategies to reproduce information well enough to pass a test but gaining little meaning from it (Lawless & Richardson, 2002; Richardson, 2003). In such a situation, it is best for learner interaction with content to be facilitated by the instructor.

Learner–Instructor Interactions
Instructor interactions fall into three categories: administrative, facilitative, and relational. Administrative interactions are focused on setting expectations. Online teaching and learning lack the vicarious cues that communicate expectations in traditional classrooms. In moving courses online, the implicit needs to become explicit. Learners need to know what it means to participate in the course, where to find materials, and what they need to accomplish and by when. Instructors communicate those expectations through instructions, scoring rubrics, and feedback.


Administrative interactions relate closely to the instructional method of nurturing progress, and they intersect with considerations during the evaluation phase of redesign. For instance, setting expectations for class participation by way of a scoring rubric indicates learner performance will be evaluated in relation to those expectations.

Facilitative interactions involve the instructor in bringing about learner interactions with content and/or their peers. These interactions are essential if learners are to experience the course as intended. They are intertwined with all instructional methods, as they steer learners through information sharing and skills illustration and support their skills practice. Courses privileging content facilitation can be experienced as private lessons, with learners depending upon the instructor for all course-related information. This model works best for small, skills-intensive courses where learners need concentrated feedback. It does not scale without raising instructor workload concerns. We'll revisit peer facilitation in the next section; considerations related to facilitating peer interactions depend upon the role peers will play in the learning process.

Relational interaction is neither student nor instructor centered; both are principal players in determining the kind of relationship they will have. The quality of relationships will vary, as they are co-constructed between learner and instructor. Immediacy behaviors will be influential and have implications for the media phase of course redesign, as synchronous channels are better suited to them (Pelowski, Frissell, Cabral, & Yu, 2005).2

Learner–Learner Interactions
Peer interactions also depend upon a kind of immediacy called social presence.3 Considerations for how to facilitate social presence are essential if peer interaction is desired. Peers may be cast as community members, information resources, or collaborators. The more involved they are to be in the learning process, the more involved the instructor will need to be in facilitating peer connections. Peer interactions are most related to the instructional methods of guiding practice and nurturing progress.

Courses privileging peer interaction are built around the principles of collaborative learning. This approach leverages the power of social learning. Past studies have shown that learners perform better on concept knowledge questions following discussion, even if none of the peers knew the answers at the start of the discussion (Smith et al., 2009). With the right media, feedback mechanisms, and instructor facilitation, collaborative learning can scale to larger-enrollment courses. It may, however, violate learner expectations of working in isolation, so explicit expectations for how the course will run, along with instructor monitoring, are needed, particularly at the start of a course. Monitoring activities connect with both media and evaluation considerations.


Role of Metaphor in Interaction Decisions
Used as a lens for course redesign, metaphor helps to filter out extraneous considerations and focus attention on the choices most important in bringing about your particular course. Looking at your course through your chosen metaphor, the relevant design and interaction choices become obvious. In Chapter 4, we exemplified this process, examining a single course viewed through three different metaphors to demonstrate how decisions varied, resulting in plans for three distinct courses. Each course had a unique look and feel, with elements shaped specifically to fit the content, learners, and instructor. In the third phase of redesign, those course plans were intersected with considerations related to media to determine how best to bring the courses to life.

Media Considerations

Media are the channels that support interaction. Since interaction is the main activity in online teaching and learning, the technologies used to support it are labeled media in the DIME model. In order to choose course media, we must identify the interaction-related course activities set out in the plan developed during phase two of redesign. We proposed a media selection taxonomy featuring 10 such activities to be carried out through technology. The action verbs used in objectives and instructional methods help us recognize the activities involved. The activities were organized around interaction type (learner, instructor, peer) and media characteristics. Media characteristics include one-way and two-way interactions; two-way interactions are further described as either synchronous or asynchronous. Matching activities with the media characteristics required to enable them makes it easy to identify the specific media features (or functionality) needed to meet interaction needs.

In Chapter 5, we explored sample tools that enable the 10 interaction-related course activities and considered their fit with our previous redesign decisions to select media. We matched the course activities depicted in the plans with specific technology tools to enable them. We demonstrated this process for two of the course plans developed in Chapter 4.
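To make the matching concrete, here is a minimal sketch in Python of how activities might be mapped first to media characteristics and then to candidate tools. It is purely illustrative: the activity names and tool lists below are hypothetical stand-ins, not the 10-activity taxonomy itself.

# Illustrative sketch only: the activity names and tools here are
# hypothetical stand-ins, not the book's actual media selection taxonomy.

ACTIVITY_NEEDS = {
    # activity: (interaction type, directionality, timing)
    "share information": ("learner-content", "one-way", None),
    "discuss readings": ("learner-learner", "two-way", "asynchronous"),
    "hold office hours": ("learner-instructor", "two-way", "synchronous"),
}

CANDIDATE_TOOLS = {
    ("one-way", None): ["recorded video", "posted documents"],
    ("two-way", "asynchronous"): ["discussion forum", "email"],
    ("two-way", "synchronous"): ["video conference", "text chat"],
}

def media_options(activity):
    """Return candidate media features/tools for a course activity."""
    _, directionality, timing = ACTIVITY_NEEDS[activity]
    return CANDIDATE_TOOLS[(directionality, timing)]

for activity in ACTIVITY_NEEDS:
    print(f"{activity}: {', '.join(media_options(activity))}")

The point of the sketch is the lookup logic: once an activity's interaction type, directionality, and timing are known, the media features that can support it follow mechanically.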


Role of Metaphor in Media Decisions
Metaphor provides a means to conceptualize the organization of your course. It implicitly organizes the content into modules within which the course activities fit. From that organizing scheme, media needs can be determined and specific tools identified. Metaphor creates a visual depiction of how learners will encounter the course. By employing a graphical organizer, such as a storyboard, the implicit course organization becomes explicit. For instance, we applied the playground metaphor to the related design and interaction decisions comprising our course plan. In so doing, we determined the implied organization was a five-module course layout, with each module standing independent of the others, much the way playground equipment functions. We fitted each of the course activities from our course plan into one of five boxes to create a course mock-up. We applied the media taxonomy to each activity to generate a list of specific media needs. Metaphor again narrowed the scope of choices, bringing into focus the layout and interaction needs to be filled by technology. Specific tools were discussed, but the focus was on the media selection process and its role in bringing courses to life.

The final phase of course redesign revisits choices made during all three of the previous stages to decide how best to assess student learning, instructor effectiveness, and course quality.

Evaluation Considerations

Evaluation is the process by which we determine how things should go, how things are going, and how things went. We use formative and summative methods to answer these questions as they relate to student learning, instructor effectiveness, and course quality. Formative evaluation enables changes to be made along the way. Summative evaluation provides data about how things went that can help inform future activities. In terms of course redesign, we focused primarily on student learning.

Things change when we move them online. Some of the changes are advantageous; others are not. Online courses provide access to data that isn't available in face-to-face classes. Learning management systems (LMS) generally maintain student activity logs that can support the assessment of levels of engagement, for instance, by charting which materials learners opened, when, and for how long.
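As a minimal sketch of how such log data might be summarized, assuming the LMS can export its activity log as a CSV file (the file name and the user, resource, and minutes columns are hypothetical; real exports vary by system):

# Minimal sketch: totaling learner time-on-resource from a hypothetical
# LMS activity-log export. Column names and file name are assumptions.
import csv
from collections import defaultdict

def engagement_summary(path):
    """Return total minutes each learner spent on each resource."""
    totals = defaultdict(float)  # (user, resource) -> minutes
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[(row["user"], row["resource"])] += float(row["minutes"])
    return totals

for (user, resource), minutes in sorted(engagement_summary("activity_log.csv").items()):
    print(f"{user}: {minutes:.0f} min on {resource}")

Even a simple tally like this can flag learners who never opened a key resource, something no attendance sheet in a face-to-face class could reveal.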


The content of class discussions is also archived in online courses; the archives allow contributions to discussions to be analyzed and used as part of student performance data or as a means to assess general understanding. Unfortunately, online contexts also seem to feature increased opportunities for academic dishonesty. Exam security is especially trying. Articulating the rules is particularly important, as learner perceptions about what constitutes cheating in online settings seem to vary (King, Guyette, & Piotrowski, 2009).

Evaluation decisions begin in phase one of the redesign process. Summative evaluation choices are embedded in the way strong learning objectives are written: when we make objectives observable, measurable, attainable, and specific, we make evaluation-related decisions. Evaluation choices present specific media needs, such as means to deliver online feedback or administer exams. In Chapter 6, we argued that evaluation is a value-laden process. In making decisions about what to evaluate and how, we determine what is important and what is not. Multifaceted approaches are recommended in assessing student learning, instructor effectiveness, and course quality. Transparency, along with the use of valid and reliable methods, increases rigor.

The DIME model represents course redesign as a series of considerations separated into the categories of design, interaction, media, and evaluation, but there is overlap. Course redesign is an iterative process: decisions made early in the process influence later choices, and later choices may indicate a need to revisit previous decisions. Metaphor provides an organizing scheme that narrows the scope of considerations, making decision making manageable. The choice of metaphor may be the most essential decision you make. Next, we explore further considerations to contemplate as you prepare to implement your redesigned course.

FURTHER CONSIDERATIONS

We turn now to other considerations related to moving courses online. The matters we'll discuss fall into two categories: career and course management. Career considerations are strategic-level concerns related to course redesign that have potentially long-term implications. Course management considerations are tactical-level suggestions aimed at making your course implementation a success. The suggestions offered are grounded in experience and presented for your consideration. We begin our discussion with considerations related to course management.


Course Management Considerations

Place the Dots Really Close Together
No matter how clear your course organizing scheme is to you, at some point your learners are going to get lost. It's inevitable. For instance, during my first foray into course redesign, students would often submit assignments from the wrong lesson or submit more advanced work before completing the assignments designed to support it. They seemed to simply click around the modules to accomplish some work, any work. This left us all confused, as I couldn't figure out how they were moving through the course and they felt ill-prepared for the work they were doing. So, I placed the dots closer together by revising the course to have five modules, each with the same number of parts. I added audio instructions in addition to those provided in text. This provided a path and a consistency that resulted in successful navigation through the course.

In courses that are not organized in a linear fashion, it helps to develop materials that teach the course structure. A video introduction to the course that describes how things work and where information can be found is helpful. Weekly video shorts that guide learners to the materials and activities for that week are a good follow-up. Descriptive labels applied consistently to content are also helpful.

It also helps to develop a system for labeling things learners are expected to do at particular intervals (i.e., weekly). I use the labels Look, Act, and Due (a small sketch of this skeleton appears at the end of this section). Look indicates any materials students should review in support of learning; it covers the variety of channels through which information may be delivered, for instance readings, videos, and slideshows. Act indicates activities learners should complete. These may be prep work such as completing surveys, posting a comment, reviewing peer work, or completing a simulation. Due indicates assignments to be completed for submission by a particular date. Keeping due dates consistent helps learners stay organized; for instance, all work due is to be submitted by 6:00 a.m. on Mondays.

It's easy for course content to become overwhelming to learners. They become lost in a sea of links (to articles, to videos, to assignments, etc.). If possible, limit learner access to less pertinent content, releasing it only when needed. This is particularly important when learners are first getting oriented to the course. In the end, learners must connect the dots for themselves, but placing the dots as close together as possible makes things easier, increasing learners' persistence and likelihood of success in your course.
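To illustrate what this consistency looks like in practice, here is a minimal sketch in Python of a module skeleton built on the Look/Act/Due labels, with every deadline normalized to Monday at 6:00 a.m. The module title and items are hypothetical examples; only the skeleton matters.

# Minimal sketch: one module in a consistent Look/Act/Due skeleton.
# Module title and items are hypothetical examples.
from datetime import datetime, timedelta

def following_monday_6am(start):
    """The Monday at 6:00 a.m. that follows the given date."""
    days_ahead = (7 - start.weekday()) % 7 or 7  # Monday is weekday 0
    return (start + timedelta(days=days_ahead)).replace(
        hour=6, minute=0, second=0, microsecond=0)

def make_module(title, look, act, due, start):
    return {
        "title": title,
        "Look": look,  # materials to review
        "Act": act,    # activities to complete
        "Due": due,    # work to submit
        "deadline": following_monday_6am(start),
    }

module = make_module(
    "Module 1: Foundations",
    look=["Chapter reading", "Intro video"],
    act=["Post a comment", "Review a peer's draft"],
    due=["Reflection paper"],
    start=datetime(2013, 9, 3),
)
print(module["title"], "- all work due", module["deadline"])

Because every module is generated from the same skeleton, learners always know where to look, what to do, and when work is due.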


To ensure your own success, we next investigate ways to maintain realistic expectations.

Keep Expectations Realistic
Much of the added time and effort involved in online teaching is due to increased student contact and individualized instruction (Cavanaugh, 2005). This is consistent with my experience and that of my faculty colleagues. Therefore, it is important to manage learner expectations about your availability. I continue to work on this, as I personally find it difficult to ignore learner messages posing questions for which they could easily look up the answer or consult another source. Becoming an easy information source for learners will reinforce the behavior of coming to you first. Practically speaking, answering all learner messages will greatly increase the amount of time you spend on your online course. Pedagogically speaking, answering all of their questions may result in lost learning opportunities for your students. There's value in learners figuring things out for themselves. Educational consultant and professor Terry Doyle (2011) synthesizes the last 15 years of neuroscience research on learning this way: the one who does the work does the learning.

Recently, I implemented a course policy designed to facilitate independent and collaborative inquiry among my students and to move them away from the default behavior of directing every question they had to me. An esteemed colleague and collaborator, Dr. Albert Ingram, introduced me to this policy. It's called 3 before me.4 Essentially, when students have a question, they must consult at least three sources for the answer before they can ask me. Students have become skilled at independent problem solving, and the number of emails about administrative issues (e.g., where can I find it, or when is this due) has dropped dramatically, leaving me more time for meaningful interactions with students.

Some learners will be disengaged, and you won't always know why. Online learners disengage for a variety of reasons; disengagement is not always about the class. Online students tend to be older and so are subject to work-life balance issues in addition to a lack of readiness and other factors (Street, 2010). Early intervention on the part of the instructor may help, as it allows for additional clarification of course expectations. It also provides the opportunity to inform learners about their options. Unfortunately, learners sometimes decide to leave the course when they realize they do not have the time to devote to it. Although early intervention may result in increased class attrition, it may ultimately help keep learners in the program.

One final expectation to hold is that there will be hiccups that affect the course experience. Learners may respond to an assignment differently than expected, technology may not work as intended, and everything may take longer than you thought it would.


This is especially true when you first redesign a course. If you expect the unexpected, you won't be surprised when it happens, and you can prepare your learners for that eventuality as well. One way to reduce the impact of these early hurdles is to beta test your courses. I often use a summer course to roll out a new redesign: enrollments tend to be lower, I have fewer competing demands and so more time for resolving issues, and everyone seems to be more relaxed. When the rollout must happen during the regular semester, I try to negotiate a lower-than-normal enrollment cap or offer the course exclusively to learners in a continuing education program, as they tend to be more independent and more candid in the improvement feedback they provide.

Now that we've looked at some tactical-level concerns related to course redesign, let's turn our attention to more strategic concerns. In the next section, we explore the career considerations of intellectual property rights and promotion and tenure.

Career Considerations

Intellectual Property Considerations
There are two sets of intellectual property (IP) rights that should be considered in redesigning courses for online delivery: rights related to the IP you create and the use of materials that are the IP of others. Concerns about IP are not new, but the online environment increases their visibility. Online courses often cross national boundaries, and IP laws vary. According to the U.S. Copyright Office, nations that are members of the Universal Copyright Convention should honor the copyright of individuals from other countries when the copyright symbol ©, the name of the copyright holder, and the date the work was produced are prominently displayed (e.g., © 2013 Jane Doe). We begin with concerns related to the use of materials for which others hold copyright.

Using the IP of Others. In redesigning courses, any materials shared with learners will need to be your own original work or be used in one of the following ways: under a license agreement, with the permission of the originator of the work, or according to fair use laws. For instance, you cannot download files from the web and put them in your course without the permission of the copyright holder. The same is true for adding images you did not create yourself to your slideshows. Most of the material on the Internet is copyrighted unless the owner specifically places it in the public domain.


(U.S. law does not require work to be marked as copyrighted; tangible works are automatically the IP of their creator.)5 There are institutional as well as individual responsibilities assigned under copyright laws. Most universities have support services to assist with copyright rules and permissions. Consult your university library about the availability of such services on your campus.

Protecting Your Own IP. Generally speaking, academics maintain ownership of the works they create unless they specifically relinquish copyright, in writing, to another party. For instance, in publishing a book, authors frequently give copyright to the publishing house; with the transfer of copyright go the rights to control future use of the work. In the regular execution of their duties, professors maintain control of their course syllabi, class notes and slides, and any other materials they produce. Typical "work for hire" laws have not been applied to professors, in order to protect academic freedom and the open exchange of ideas (see Springer, 2004, for a discussion). With the widespread use of learning management systems and online courses, however, universities now have significant pieces of faculty IP stored on their servers. This has raised questions about whether the rights to the materials belong to the faculty member or the institution, and institutions seem to vary in their policies on the question. An interpretation of the issues and laws is outside the scope of this discussion (see Twigg, 2006, for more). The focus here is on raising awareness that the copyright status of online courses is in question. If you are concerned about controlling the future use of your work, you may want to investigate the rules at your institution. You might also consider establishing copyright explicitly, to ensure you and the university are in agreement about how materials can and cannot be used, by whom, and for what purpose.

In the final section, we explore a key concern for professors still in pursuit of tenure and/or promotion. Time and effort considerations may be relevant to contract faculty as well.

Promotion and Tenure Considerations
Quality online courses take time to produce, and they take time to teach. A strong perception exists that they take more time than traditional courses. According to an Association of Public and Land-Grant Universities survey of faculty, 64% of the 10,700 respondents reported that online classes take "somewhat more" or "a lot more" effort than traditional classes (McCarthy, 2009).


Furthermore, for those seeking tenure and promotion, that added effort is not necessarily a valued activity when it comes to making those decisions. Schell (2004) surveyed 232 tenure-track faculty members at higher education institutions in the United States to explore the perceived value attached to the development of online courses; the development of online materials had little to no influence on the success or failure of simulated promotion files. You'll want to consider the potential career impact in your individual situation. Redesigning courses for online delivery may take time away from the pursuit of activities deemed more valuable in promotion decisions. Those who wish to engage in course redesign may want to negotiate for their efforts to be considered part of their service load. Another strategy to consider is to engage in research as part of your redesign efforts. There are a number of academic journals dedicated to this type of research in which to publish. The Center for Online Learning, Research, and Service at the University of Illinois-Springfield has a list of relevant journals available on its website.6

CONCLUSION

The DIME model of course redesign is all about possibilities. The four sets of considerations proposed in the model helped us find ways to develop a quality experience while keeping workloads manageable for our learners and ourselves. Metaphors provide a framework for reflection and decision making. New media are constantly introduced, ensuring an ongoing need for reflective course development. The concepts we have reviewed here have stood the test of time and changing technology. As new ideas in education and technology shape our environments, the DIME model and metaphors should be tools you can turn to again and again to chart your path (and redesign your courses!). Course redesign is an ongoing journey that promises to engage and inspire all who accompany you on it. Our time together is drawing to a close, but in the words of Gandalf in the film adaptation of J.R.R. Tolkien's work, "End? No, the journey doesn't end here."

NOTES

1. We defined contextual relativism in Chapter 1 as attending to the setting or context within which undergraduate student intellectual development occurs (Perry, 1999).


2. In Chapter 4, we discussed immediacy behaviors that help to communicate the presence of congruence, empathy, and positive regard, all necessary components for building relationships in professional contexts (Witt, Wheeless, & Allen, 2004; Rogers, 1962).
3. Social presence is the perception of psychological closeness (Swan & Shih, 2005).
4. Dr. Albert Ingram is an associate professor of Lifespan Development and Educational Sciences at Kent State University in Ohio, USA.
5. For a discussion of the public domain and links to sources of materials placed in the public domain, consult the EdTechTeacher online newsletter at http://edtechteacher.org/index.php/teaching-technology/research-writing/63-public-images.
6. The list of journals is available at http://www.uis.edu/colrs/research/journal/

REFERENCES

Bento, R. (2000). The little inn at the crossroads: A spiritual approach to the design of a leadership course. Journal of Management Education, 24(5), 650–661.
Cavanaugh, J. (2005). Teaching online: A time comparison. Online Journal of Distance Learning Administration, 8(1). Retrieved from https://www.westga.edu/~distance/ojdla/spring81/cavanaugh81.htm
Doyle, T. (2011). Learner-centered teaching: Putting the research into practice. Sterling, VA: Stylus.
King, C. G., Guyette, R. W., & Piotrowski, C. (2009). Online exams and cheating: An empirical analysis of business students' views. The Journal of Educators Online, 6(1). Retrieved from http://www.thejeo.com/Archives/Volume6Number1/Kingetalpaper.pdf. Accessed on April 8, 2013.
Lawless, C. J., & Richardson, J. T. E. (2002). Approaches to studying and perceptions of academic quality in distance education. Higher Education, 44, 257–282.
McCarthy, S. A. (2009). Online learning as a strategic asset. APLU-Sloan National Commission on Online Learning. Retrieved from http://sloanconsortium.org/publications/survey/APLU_Reports. Accessed on July 24, 2013.
Moore, M. G. (1989). Three types of interaction. In M. G. Moore & G. C. Clark (Eds.), Readings in principles of distance education (pp. 100–105). University Park, PA: American Center for the Study of Distance Education.
Pelowski, S., Frissell, L., Cabral, K., & Yu, T. (2005). So far but yet so close: Student chat room immediacy, learning, and performance in an online course. Journal of Interactive Learning Research, 16, 395–407.
Perry, W. G. (1999). Forms of intellectual and ethical development in the college years: A scheme. San Francisco, CA: Wiley.
Richardson, J. T. E. (2003). Approaches to studying and perceptions of academic quality in a short web-based course. British Journal of Educational Technology, 34, 433–442.
Rogers, C. (1962). The interpersonal relationship: The core of guidance. Harvard Educational Review, 32(4), 416–429.


Schell, G. P. (2004). Universities marginalize online courses. Communications of the ACM, 47(7), 53–56.
Smith, M., Wood, W., Adams, W., Wieman, C., Knight, J., Guild, N., & Su, T. (2009). Why peer discussion improves student performance on in-class concept questions. Science, 323(5910), 122–124.
Springer, A. (2004, November). Intellectual property issues for faculty. American Association of University Professors. Retrieved from http://www.aaup.org/issues/copyright-distance-education-intellectual-property/resources-copyright-distance-education-and/intellectual-property-issues-faculty
Street, H. (2010). Factors influencing a learner's decision to drop-out or persist in higher education distance learning. Online Journal of Distance Learning Administration, 13(4). Retrieved from http://www.westga.edu/~distance/ojdla/winter134/street134.html
Swan, K., & Shih, L. F. (2005). The nature and development of social presence in online discussions. Journal of Asynchronous Learning Networks, 9(3), 115–136.
Twigg, C. A. (2006). Who owns online courses and course materials? Center for Academic Transformation. Retrieved from http://www.thencat.org/Monographs/Whoowns.html. Accessed on March 15, 2013.
U.S. Copyright Office. Retrieved from http://www.copyright.gov/fls/fl100.html. Accessed on July 19, 2013.
Witt, P. L., Wheeless, L. R., & Allen, M. (2004). A meta-analytical review of the relationship between teacher immediacy and student learning. Communication Monographs, 71(2), 184–207.

ABOUT THE AUTHOR

Robyn E. Parker, Ph.D., is a tenured full professor in the College of Business Administration at Plymouth State University in New Hampshire, USA. She earned her doctorate in organizational communication from Wayne State University and her master's degree in human resource development from Boston University. She has been teaching in higher education since 1990 and online since 2005. A pragmatic scholar, she has dedicated her career to studying the effects of new communication technologies both in the classroom and in work organizations. Her expertise is in conducting needs assessments and communication audits and in designing interventions to improve organizational performance. A seasoned trainer and instructional designer, she is committed to making eLearning more effective in both university and corporate contexts. She has received grant funding in support of her work from both FIPSE and the Ohio Learning Network. Her research on this subject has most recently been published in The Research Journal of the Center for Educational Technology and in the journal Education, Technology, & Society. Passionate about faculty development, she first created the DIME model as a framework for her workshops in course redesign.
