Postscript to “Designing and Assessing Hybrid Cinema Studies Courses”: Redefining “Hybrid,” Modular Design, and Student Evaluations as Research
Because I had previously taught hybrid and fully online courses, my colleagues deemed me lucky when the University of Washington shifted to online instruction at the end of last winter. While my facility with the Canvas Learning Management System and other online platforms freed me from learning new technologies overnight, it did not prepare me for crisis pedagogy. My prior online course enrolled students who had opted for online instruction, not students conscripted into online courses by COVID-19. The course design also assumed baseline hardware, software, and internet bandwidth requirements established in consultation with the University of Washington’s Continuum College, the unit then overseeing online instruction. I could not assume similar infrastructure access from students who may have relied on campus computing labs, friends’ laptops, and coffeehouse Wi-Fi. Nor could I expect my students to reside in the same time zone or in countries where they could discuss controversial topics without fear for their safety. While I could survey students to troubleshoot time zone and technology access issues, I had no training to address their heightened anxieties around money, health, and family, anxieties that affect cognitive processing as well as physical and mental health. Nor did I have experience making near-instantaneous—yet considered—changes to assignment requirements and grading policies late in the quarter to support Black and other students whose ability to sit through Zoom session after Zoom session further eroded as they juggled academics with nightly protesting. Like my colleague Louisa Mackenzie, I found myself teaching while grieving a mother-in-law who died the third week of the quarter. To reiterate, online teaching experience did not prepare me for online crisis pedagogy. However, it did leave me versed in research on evidence-based online teaching, and it allowed me to pilot and assess different materials and assignments.
In what follows, I’ll return to issues raised in my 2015 Cinema Journal Teaching Dossier article, describing how and why I altered my course design in the shift to fully online instruction and how my institution used student evaluations of online courses as research.
Why Hybrid Pedagogy?—Revisited
Since 2014, my cinema studies courses have featured two to three weeks of hybrid instruction to accommodate attendance at the Seattle International Film Festival (SIFF) and to incorporate festival films effectively into course assignments. “Hybrid” took on a new meaning when the University of Washington instituted remote teaching. While all instruction occurred online, my Spring 2020 film course blended asynchronous activities with one synchronous Zoom meeting per week. I chose a hybrid format despite the bandwidth live videoconferencing requires, especially when students simultaneously view streaming media. I did so in part because EDUCAUSE’s 2019 Study of Undergraduate Students and Information Technology reports a 70% preference for face-to-face courses; the study’s subjects represent the students we typically encounter in face-to-face courses, the very students who would now learn remotely. My university supported online synchronous instruction via a Zoom Pro license and equipment loans to students without computers. Moreover, a survey I distributed early in the quarter indicated that enrolled students’ computers and Wi-Fi could handle Zoom’s system requirements, although some had to cut their video feeds to maintain connection stability. In their survey responses, students also remarked on the efficacy of live meetings for developing rapport and for framing critical responses to peers with respectful body language and vocal tone. One student commented that live meetings made him “feel like part of a group,” and others suggested ways to configure synchronous small-group discussions to strengthen group ties and ensure students met as many of their peers as possible.
Students’ commentary on synchronous discussion’s affordances mirrors research in online learning. Stefan Hrastinski observes that synchronous discussions are characterized by an “exchange of social support” and the “psychological motivation” to contribute. However, he also argues for blending synchronous learning with asynchronous activities because each serves different purposes; the two modes thus complement one another. I’ve done so since the mid-1990s, when I began assigning online responses for students to complete and review before face-to-face class meetings.
In my Spring 2020 online cinema studies course, I expanded my slate of asynchronous activities, increasing cognitive difficulty over the week in accordance with Bloom’s taxonomy. Early in the week, students took timed, open-book quizzes that assessed comprehension of textbook and lecture concepts and asked them to begin applying those concepts to screenshots or clips from course films. I developed question banks and shuffled answer choices to ensure each student received a different version of the quiz. I also built in automatic feedback that explained wrong answers immediately after students submitted the quiz. In the two days before our live meeting, students continued to apply textbook and lecture concepts to course films, submitting a discussion posting and replying to two peers. While I required at least 250-300 words for initial postings and 75-100 words for replies, students frequently exceeded the minimum, offering insightful, well-supported analyses and thoughtful replies to peers. Lynette Watts’s 2016 review of research on synchronous and asynchronous discussion observes that “asynchronous interactions allow students to take time to consider their thoughts [and] engage with the content more deeply,” findings my students’ work affirms (see Figure 1 for a sample discussion board and Figure 2 for a student posting). Although my asynchronous discussion assignment uses the “choose a question, post once, reply twice” format critiqued by Shannon Riggs and Kathryn Linder, I avoid the problem of repetitive responses by posing a breadth of questions, including a free response option, and limiting the number of students who can address each question. I’m still considering whether my format escapes the trap Riggs and Linder compellingly describe: being instructor-centered rather than student-focused.
Each week ended with synchronous discussion in which students built on asynchronous exchanges, further analyzing course films and assessing scholarly articles. These sessions featured a mix of generative writing followed by whole-class discussion (both spoken and typed into Zoom chat) and small-group exchanges in Zoom breakout rooms, with groups taking notes on a shared Google Doc and reporting out highlights. I ended with a short, interactive lecture introducing the following week’s film and outlining elements for students to focus on while viewing. Ideally, I would have recorded my lectures, breaking them into short videos interspersed with polls and short writing prompts. I also would have liked to integrate live polling, word cloud generation, and topic proposing/upvoting into synchronous discussions, something I do with PollEverywhere during in-person courses. Time constraints and a decision to limit course technology to four tools—Canvas, Google Docs, VideoANT, and Zoom—prevented me from doing so.
I divided my quarter-long course into weekly modules to build clarity and consistency. Students could locate all materials for the week within the module. Modules also linked to readings, streaming media, live Zoom meetings, and assignment submission and discussion areas. I offered multiple ways for students to communicate with me, linking to my drop-in hours and a weekly check-in space where students could pose questions or answer a fellow student’s question if I hadn’t yet replied. Students who wanted to delve more deeply into the week’s topics could explore optional links collected at the end of the modules. Each module featured an overview that introduced the week’s topic, stated learning goals, listed readings and assignments, presented an agenda for our live class meeting, and provided a sequence for completing the week’s tasks (see Figure 3 for a module and Figure 4 for a module overview).
Because online learning was a new experience for most of my students, I held two short introductory live meetings to tour the module structure and troubleshoot with students as they tested Zoom features and other platforms they would use throughout the quarter.
Student Evaluation as Research
In my previous Cinema Journal Teaching Dossier article, I shared a student survey I used to gather feedback and make changes to online instruction during—rather than after—the course. I similarly surveyed students mid-quarter in Spring 2020, but I did not have to design my own questions. The University of Washington’s Office of Educational Assessment developed new midterm and end-of-term evaluation forms for remote courses. The midterm form collected numerical data on students’ ability to engage course concepts and keep up with assignments as well as the quality of instructor communication and responsiveness. In addition, the survey included open-ended questions on elements of the course that helped and hindered students’ learning. I used midterm feedback to negotiate new assignment deadlines, altering due times from noon to 11:59 p.m. I also clarified confusion regarding extra credit and added resources to our “Start Here” module to help students who reported difficulty in adjusting to online learning in all their courses.
While the new midterm and end-of-term evaluation forms helped individual instructors like me assess the effectiveness of our online courses, the university also undertook a large-scale analysis of evaluation results with the goal of improving online teaching across our campuses. University of Washington Information Technology and the Office of Educational Assessment analyzed student responses to all open-ended midterm and end-of-term evaluation questions to identify prevalent themes and assess whether those themes changed from the middle to the end of the quarter (spoiler: they did not). The depth of findings makes summarizing the full research report impossible in this postscript. Instead, I’ll focus on key takeaways that both complement and supplement students’ feedback on my cinema studies course.
Students report that “increasing efficiency in remote learning”—for example, “keeping all course content in a single location” and recording live lectures—can “reduce extra stressors.” They desire clear, direct communication from instructors regarding assignments, grading, and expectations. They also want to interact with peers on discussion boards, through group projects, and in breakout rooms interspersed throughout well-organized live lectures. Moreover, students emphasize an increased need for examples—particularly real-world examples—in online courses. I’ve considered these findings while designing and teaching my current course on stories within and across media. I’ve maintained the synchronous/asynchronous blend and modular design described above, but I’ve added a live weekly “café” to increase interaction. Attendance is optional, and students may discuss whatever they wish. To my surprise, the informal café has never gone unattended. In addition to giving students another opportunity to connect with one another, I’ve attempted to offer more examples, whether providing sample student work along with project assignment prompts, sharing my screen to display examples while speaking, or asking students to illustrate discussion points with examples. As we face another term of online teaching, I remember my most important evaluation takeaway: the need to remain understanding and flexible as we all navigate continuing health and social pandemics during a time of political transition.
Kimberlee Gillis-Bridges is a Senior Lecturer in English at the University of Washington, where she also directs the Computer-Integrated Courses Program. Since the mid-1990s, she has experimented with educational technologies and smart classrooms in her film, literature, cultural studies, and writing courses. In addition to teaching, she gives frequent, invited presentations on technology-integrated pedagogy and the scholarship of teaching and learning. Her writing has appeared in More Ways to Handle the Paper Load–On Paper and Online (NCTE Press, 2005) and The Bedford Bibliography of Basic Writing (Bedford/St. Martin’s, 2004).
Louisa Mackenzie, “The Teachers Are Not Okay: A Plea for Trauma-Informed Administration in Higher Education,” Medium, September 21, 2020, https://medium.com/@louisamackenzie/the-teachers-are-not-okay-a-plea-for-a-trauma-informed-administration-in-higher-education-99f9b3b37eb2. ↑
Dana C. Gierdowski, 2019 Study of Undergraduate Students and Information Technology, EDUCAUSE, October 30, 2019, https://library.educause.edu/resources/2019/10/2019-study-of-undergraduate-students-and-information-technology. ↑
Stefan Hrastinski, “Asynchronous and Synchronous E-Learning,” EDUCAUSE Review, November 17, 2008, https://er.educause.edu/articles/2008/11/asynchronous-and-synchronous-elearning. ↑
Patricia Armstrong, “Bloom's Taxonomy,” Center for Teaching, Vanderbilt University, 2010, https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/. ↑
Lynette Watts, “Synchronous and Asynchronous Communication in Distance Learning: A Review of the Literature,” Quarterly Review of Distance Education 17, no. 1 (2016): 23–32. ↑
Shannon A. Riggs and Kathryn E. Linder, “Actively Engaging Students in Asynchronous Online Classes,” IDEA Paper, no. 64 (December 2016): 1–10, https://www.ideaedu.org/Portals/0/Uploads/Documents/IDEA%20Papers/IDEA%20Papers/PaperIDEA_64.pdf. ↑
Henry Lyle, Peter Seibel, Florencia Marcaccio, and Angela Davis-Unger, “Analysis of Course Evaluations, 2020,” PowerPoint Slides, University of Washington, https://itconnect.uw.edu/wp-content/uploads/2020/09/Analysis-of-course-evals-2020.pdf. ↑
Lyle, Seibel, Marcaccio, and Davis-Unger, “Analysis of Course Evaluations.” ↑