8 A Versatile Interactive Focus Group Protocol for Qualitative Assessments
A highly flexible focus group protocol efficiently and economically captures useful data for immediate and longitudinal course and program assessment. Special features include an index card activity that gauges satisfaction levels and a Roundtable/Ranking activity that elicits participant-generated judgments about the most positive and the most negative features of a course or program. These latter activities, with data displayed in an Excel histogram and in a color-coded Word table, can be used for what is called a “Quick Course Diagnosis” (QCD).
Assessment is becoming increasingly important on most campuses, and the U.S. Air Force Academy (USAFA) is no exception. In fact, since its highly successful North Central Association accreditation review and the appointment of a full-time director for academic assessment, USAFA has been increasingly recognized as a leader in the assessment field. For the past several years, the Center for Educational Excellence (CEE) has played a growing role in helping departments gather and interpret both quantitative and qualitative data.
To meet the need for qualitative data, we have developed a unique process—highly structured interactive focus groups—to get information and insights from those most directly affected by curriculum and pedagogical transformations: the students. The use of focus groups in academia is not a new practice. Wright and Hendershott (1992), for example, find focus groups to be an invaluable means of tapping student perceptions within a university setting. They feel that interactions among students, particularly when the facilitators are able to probe responses, yield rich data that cannot be obtained from traditional surveys.
However, the focus group protocol developed and refined at USAFA is unique because of its highly structured approaches to data collection and because of the power of the resulting reports. These focus groups employ a variety of techniques to maximize data collection from as many students as possible. We use a survey (optional); an index card activity; a cooperative, group-based activity called Roundtable/Ranking; and a series of open-ended questions. Faculty members interested in personal information about their individual courses have welcomed this focus group protocol. Focus groups have also been widely used for curriculum review and reform at the department level by course directors responsible for preparing the syllabus, ordering texts, and coordinating courses taught by more than one instructor. They have also been used to assess the impact and value of a major and to compare experimental courses with those traditionally taught. Additionally, they have been used to evaluate special programs such as a new one-week summer engineering course for high school students. To simplify terminology, we will refer to the contact persons, whether faculty members, course directors, or department assessment representatives, as “clients.”
The focus group protocol is typically used with student volunteers who come to a neutral location to share ideas and experiences about their courses or about their programs of study. To encourage representative samples, instructors teaching each section of a course will draw the names of two students and ask them to represent the class at a focus group scheduled (usually) in the early evening. It is difficult for them to say no, unless they have a genuine conflict. But, in some key cases, the same focus group protocol has worked effectively with a complete population sample that includes all students enrolled in a given course (up to 18). To assess, for example, the effectiveness of a new core law course, all students enrolled in the three pilot sections were interviewed by a CEE representative and a member of the law department who knew the course content, but who was not teaching this course at the time.
These focused student interviews seek to gather as much data as possible within a 50-minute time limit and to gather it systematically so that long-term comparisons are possible as courses or programs are modified.
THE CLIENT INTERACTIONS
All focus groups begin with a client-centered discussion to explore the following: 1) the objectives for the focus group, 2) questions and activities that will likely yield valuable data, and 3) the “logistics” involved, including selection of the students, the room layout, and any refreshments (pizza works well). The CEE uses a paired approach with one staff member responsible for conducting the actual session (the facilitator) and another staff member responsible for logistics, including the audiotaping arrangements and the preparation of the final reports. Typically, the facilitator conducts the client interview, and based on the identified needs, prepares a survey instrument, if desired, and PowerPoint slides with the course-specific open-ended questions. During the actual session, the facilitator takes the more active role, but the logistics person, who is introduced as an equal partner, is also free to ask probing questions as the discussion unfolds. Likewise, when the third step occurs—a feedback interview with the client(s)—both CEE parties contribute ideas and impressions. A fourth step is highly desirable, but not always possible, depending on the timing of the focus group: a feedback session between the client and the actual students interviewed or the students they represented.
THE STUDENT SURVEY
A short student survey is an optional way to gather additional information without adding to the session time, because early arrivals usually complete it before the actual focus group begins. Students receive the survey, after being welcomed, as they arrive for the focus group. This practice reinforces the seriousness of the project and helps students feel at ease when they are given an initial task to perform. The questions on the survey (Appendix 8.1) are often idiosyncratic, such as asking students how much time they devote to the course.
Instructions for Students
After the expected students arrive, the facilitator explains the nature of the session, including the ground rules for discussion. Students also learn how their responses will be kept confidential and how they will be used. The purpose of the focus group session is to gather information for constructive changes; the facilitator informs students that they will receive feedback from the instructor if the logistics allow for it. They also learn that, to provide a complete record of their comments, the session will be audiotaped. Only a community-based transcriber will hear the tape with their voices. To further ensure confidentiality, students identify themselves by a number acquired through a count-off rather than by name. They preface all remarks with their assigned number. Students quickly adopt the practice of saying things such as, “This is Student Number Six: I disagree somewhat with Number Five’s comment because his experience is so limited, but I think that Number Four and Number Seventeen really clarified the issues.”
THE INITIAL FOCUS GROUP ACTIVITY USING INDEX CARDS
After the students complete the survey, if one is included, and hear the opening information, they are each handed an index card. Working independently, they jot down on the card a word or phrase to describe the course or program and a number from one to five to indicate their satisfaction level. Usually the facilitator then has the students share their responses in round-robin fashion. These range from numbers and comments such as, “I gave it a five and an ‘Awesome’” to “I gave it a number one. My comment [for a geography course] was ‘As dry as eating saltines in the Sahara.’ ” This public disclosure serves a number of functions: 1) It enables students to feel comfortable with their peers because everyone has been up-front about their overall feelings; 2) facilitators also learn quickly whether they are faced with happy or unhappy campers, knowledge that can help frame probing questions; and 3) an atmosphere of trust and open disclosure is established up-front before any of the open-ended questions—often more sensitive—are asked.
THE OPEN-ENDED QUESTIONS
Two types of general questions then follow, some that everyone responds to (round-robin) and some where anyone may answer. The facilitator and the client discuss beforehand which questions all students should answer and which ones are suitable for random responses. Given the time constraints, no more than two questions should be round-robin style. These open-ended questions (Appendix 8.2) typically target issues of particular concern to clients. For example, an English professor wanted to understand the impact of one-on-one grading conferences that replaced traditionally marked papers. A political scientist was concerned about low course critique ratings and wanted feedback from students that could help her strengthen her teaching methods. The law department was interested in the value of a new textbook in a core course. The management department was concerned about the quality of advising that their majors received. The physics and math departments wanted to know the value of preflights, a series of web-based questions that students complete prior to each lesson. A course director in the computer science department wanted to assess the impact on learning and attitude of students programming Lego robots: multiple focus groups targeted the Lego courses (experimental) and traditionally taught courses (control).
A STRUCTURED GROUP ACTIVITY: ROUNDTABLE/RANKING
A group-based assessment activity called Roundtable/Ranking allows students to identify and clarify their own issues. Unlike the volunteer open-ended questions, where some students may remain silent, this highly structured activity ensures equitable contributions from all participants. The facilitator places the students into small groups and gives each group a handout (Appendix 8.3) with specific instructions to brainstorm all the strengths of the course or program and then to brainstorm all the things about the course that could be improved. These two activities follow rapidly, one after the other, so the students don’t slip into an analytical mode. The paper circulates rapidly from one student to another as each adds an idea, saying it aloud. Then, each group rank-orders the strengths of the course and then the weaknesses. The rank-ordering activity is critical because it enables students to reach consensus on their priorities and to eliminate any idiosyncratic responses. The brainstorming and ranking parts of Roundtable/Ranking together take only ten minutes out of the 50 minutes available.
The focus group session typically concludes with more open-ended questions. Depending on their complexity, we usually offer eight to ten questions for open discussion. Before leaving, the two CEE staff members make a point of thanking students for their contributions.
THE CLIENT REPORTS
All clients receive a neat, comb-bound summary report with colorful graphics. If we conducted focus groups for multiple sections of a course or program, often a single report captures combined data. Any survey results are assembled and displayed appropriately. The index card data are presented through a colorful Excel histogram (Appendix 8.4). Each column’s height corresponds to the number of students selecting that rating. For example, if ten students indicated a satisfaction level of “four” for a course, that column would be twice as high as the one representing the five students who gave it a “two.” Listed within each column are the descriptive words or phrases (e.g., “Stimulating,” “Awesome,” “A Real Challenge”). The histogram provides an easily interpreted overview of students’ satisfaction levels. As improvements occur over time, clients can expect to see the heights of the “four” and “five” columns rise accordingly, thus providing longitudinal assessment data. In fact, CEE also prepares special reports summarizing longitudinal data, especially in anticipation of accreditation visits.
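The data handling behind the histogram is simple enough to automate. The following minimal sketch, written in Python with matplotlib rather than in the Excel template the CEE actually uses, tallies hypothetical index card entries and places each card’s phrase inside its rating column; the sample data and file name are illustrative only.

```python
# Sketch only: build an index-card histogram from (rating, phrase) pairs.
from collections import defaultdict

import matplotlib.pyplot as plt

# Hypothetical index card responses: (satisfaction rating, descriptive phrase)
cards = [
    (5, "Awesome"),
    (4, "Stimulating"),
    (4, "A Real Challenge"),
    (4, "Useful"),
    (2, "As dry as eating saltines in the Sahara"),
]

counts = defaultdict(int)    # rating -> number of students
phrases = defaultdict(list)  # rating -> phrases written on the cards
for rating, phrase in cards:
    counts[rating] += 1
    phrases[rating].append(phrase)

ratings = range(1, 6)
heights = [counts[r] for r in ratings]

fig, ax = plt.subplots()
ax.bar(ratings, heights)
ax.set_xticks(list(ratings))
ax.set_xlabel("Satisfaction rating")
ax.set_ylabel("Number of students")
ax.set_title("Index card data")

# Stack each rating's phrases inside its column, as in the CEE report.
for r in ratings:
    for i, phrase in enumerate(phrases[r]):
        ax.text(r, i + 0.1, phrase, ha="center", va="bottom", fontsize=7)

fig.savefig("index_card_histogram.png")
```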
The responses to the open-ended questions are word-processed by a medical transcription service. This service, which charges about $75 per transcription, emails the final results within 24 hours so that CEE staff members can review and possibly edit them. Editing usually involves correcting the spelling of unfamiliar Air Force terms or locations (“TDY” or “Misawa Air Force Base”). The transcripts identify each student by number and provide an in-depth record of all comments. A typical entry might look like this: “Student Seven: Unlike Student Two, I felt the evaluation methods were extremely fair because the essays gave us a chance to demonstrate what we actually knew. I found myself integrating ideas in ways I hadn’t thought of before.” After several years of stressed-out secretaries and untimely reports, we now budget yearly for these invaluable professional transcription services.
The Roundtable/Ranking data are displayed in a color-coded Word table (Appendix 8.5). All the strengths and weaknesses appear under each team number. A CEE assessment expert identifies trends through a quick cluster analysis and color-codes them systematically. For example, all the strengths and weaknesses listed by any team that relate to teaching methods might be coded in yellow; evaluation items might be coded in green. Because these colors remain consistent, the tables, like the histograms, make interpretations and changes over time easy to spot.
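The cluster analysis itself is a human judgment call, but the bookkeeping behind it—assigning each Roundtable/Ranking item to a recurring theme and keeping the theme-to-color mapping fixed across semesters—can be sketched in a few lines. The themes, keywords, colors, and sample comments below are hypothetical, not the CEE’s actual coding scheme.

```python
# Sketch only: keyword-based theme coding of Roundtable/Ranking items.
THEMES = {
    "teaching methods": (["lecture", "group work", "discussion", "teaching"], "yellow"),
    "evaluation":       (["exam", "quiz", "grading", "essay"], "green"),
    "textbook":         (["textbook", "readings", "text"], "blue"),
}

def code_comment(comment: str):
    """Return (theme, color) for an item, or a default if no keyword matches."""
    lowered = comment.lower()
    for theme, (keywords, color) in THEMES.items():
        if any(keyword in lowered for keyword in keywords):
            return theme, color
    return "other", "white"

team_items = [
    "Group work helped me understand the cases",
    "Essay exams let us show what we really know",
    "The textbook is poorly organized",
]
for item in team_items:
    theme, color = code_comment(item)
    print(f"{color:>6} | {theme:<16} | {item}")
```

Because the theme-to-color mapping stays fixed, tables produced in different semesters remain directly comparable, which is what makes the longitudinal reading described above possible.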
THE CLIENT/CEE FEEDBACK SESSION
Rather than simply hand off a report for each focus group, CEE staff members prefer to meet with an individual client or sets of clients, such as all those teaching a core course. That way we can be certain that the data are understood and properly interpreted. We also like to explore ways of strengthening a course, program, or major. The data provided in these three reports—the Excel histogram reflecting the index card information, the transcript of the open-ended question responses, and the Word table reflecting trends evident during the Roundtable/Ranking activity—provide a solid basis for informed, research-based decision-making.
QUICK COURSE DIAGNOSIS
Because these focus groups proved so valuable to departments and individual faculty members, the demand for them skyrocketed. With a limited staff, this demand was difficult to meet. Thus, we began offering an abbreviated version of the focus group protocol, known as a Quick Course Diagnosis (QCD). This innovative practice received recognition at the 2001 POD conference as a “Bright Idea.” For a QCD, one facilitator, rather than two, comes to a class and administers only the index card activity and the Roundtable/Ranking activity. We eliminate the round-robin sharing of the index card data typically done during the focus group protocol. Time is also saved because students do not need assigned numbers to ensure their anonymity; there are no open-ended questions to audiotape. Thus, the entire process can be completed in 15 minutes. To augment this data, we now spend another five minutes reaching a whole-class consensus on the top three strengths of the course and the top three “weaknesses” (things to improve). The consensus is reached quickly because each Roundtable/Ranking team indicates its top strength and its top weakness, which the facilitator records rapidly on the chalkboard. If there are duplications, the reporting team adds its second choice. Depending on the number of teams, there will be approximately five items in each category. Each student then votes for a top choice among the strengths and a top choice among the weaknesses. Ties are quickly broken by additional votes so that the top three strengths and the top three weaknesses are clearly identified. A student copies everything from the board while the votes are tallied, thus giving CEE a permanent record.
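The vote tally at the end of a QCD is simple arithmetic. The sketch below illustrates it with hypothetical board items and ballots; in the live session the facilitator keeps this tally on the chalkboard, and ties are broken by a quick re-vote as described above.

```python
# Sketch only: tally whole-class votes for the top strengths in a QCD.
from collections import Counter

# Items the facilitator has recorded on the board, one top strength per team
# (duplicates replaced by each team's second choice).
board = [
    "Enthusiastic instructor",
    "Fair essay exams",
    "Relevant case studies",
    "Good use of group work",
    "Clear course objectives",
]

# One hypothetical vote per student for a top strength.
ballots = [
    "Fair essay exams", "Enthusiastic instructor", "Fair essay exams",
    "Relevant case studies", "Enthusiastic instructor", "Fair essay exams",
    "Clear course objectives", "Relevant case studies", "Fair essay exams",
]

tally = Counter(ballots)
for item in board:                      # keep the board order in the printout
    print(f"{tally[item]:>2} votes: {item}")

top_three = [item for item, _ in tally.most_common(3)]  # ties re-voted in practice
print("Top three strengths:", top_three)
```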
As with the focus groups, a faculty development assessment specialist prepares client reports. Clients requesting a QCD receive: 1) an Excel histogram with the index card information indicating the satisfaction level with the course/program/major and the words or phrases describing it; 2) a Word table reflecting the data generated by each team, color-coded to reflect trends; and 3) a second Word table based on the whole-class consensus, similarly color-coded to emphasize trends. These reports can be generated very rapidly—five to 15 minutes each—because we use templates and because we do so many focus groups and QCDs (typically over 30 a semester) that the assessment specialist becomes adept at preparing them. Although QCDs are not as rich in data as the focus groups because of the omission of open-ended questions, the three reports prove extremely valuable. The trade-off yields less class time, fewer facilitators, and decreased costs because there is no need for the transcription service.
Differences Between a QCD and an SGID
Many faculty developers are familiar with a widely used assessment tool called Small Group Instructional Diagnosis (SGID). It is based on research conducted by Joseph Clark (Clark & Redmond, 1982) when he served as a FIPSE (Fund for the Improvement of Postsecondary Education) project director at the University of Washington, Seattle. Practitioners such as Diamond (2002) and Wulff (1996) agree on the basic steps involved.
The SGID session is framed by a pre- and post-client interview. During the 30-minute in-class interview with students—usually conducted at the midterm point—the facilitator introduces himself or herself, explains the SGID process, and asks students to form groups of six to eight and select a recorder. The students then discuss the questions on the SGID feedback form with the student recorder writing down the points on which they reach consensus. The form prompts them to identify the strengths of the course backed up by concrete examples and the “weaknesses” of the course, backed up by specific suggestions for constructive changes. After this process, which typically takes only eight to 12 minutes, the facilitator records—or asks a student to record—the comments of each group on a central chalkboard. Another student recorder copies everything from the board for later analysis. Facilitators have several key tasks. They ask students for clarification or amplification on ambiguous points, and they seek to determine whether there is whole-class consensus on the issues raised. This can be accomplished by asking for a show of hands indicating agreement or disagreement with particular comments.
To prepare the client report, the SGID facilitator analyzes and organizes the material to make it meaningful to the instructor. The comments can be arranged, for example, in order of frequency under the central headings of “Things to Continue,” “Things to Consider Changing,” and “Suggestions.” The facilitator tries to “chunk” data under common themes to be shared with the instructor through a carefully crafted letter. Not all comments are included verbatim, particularly those with potentially hurtful phrasing. But representative comments can give the flavor of the SGID experience.
Both QCDs and SGIDs provide useful options to offer faculty clients. As should be evident, the QCD protocol is indebted to the inventors and practitioners of the SGID. However, there are some significant differences that faculty developers should weigh before recommending a specific approach. The QCD has four distinct advantages over the SGID. First, the index card activity with its resulting histogram is a unique and highly effective assessment activity, allowing longitudinal comparisons to track improvements. Second, because of its highly structured nature, the QCD requires far less in-class time to administer than an SGID. Even with the discussions that arise during the whole-class consensus period, virtually any experienced facilitator can be in and out the door in 20 minutes. Although the SGID protocol presumably takes no more than 30 minutes to complete, in practice, many faculty developers find that it takes far longer because of the need for amplification and clarification. Third, probably the most important advantage of the QCD over the SGID lies in the ease with which valuable reports are generated. With an SGID, the facilitator must prepare a formal letter to the client summarizing the data, making judgments about what to include and what to omit, and offering recommendations. Such a report can take hours to prepare because it involves not only cluster analysis to identify trends but also the careful selection of representative comments and careful attention to the tone of the letter. The QCD reports, on the other hand, can be prepared by a third party, usually an assessment expert, who becomes skilled at “pouring” the acquired data into templates. Thus, often overworked teaching center staffs are able to produce valuable assessment data within a fraction of the time required for the SGID analysis. Fourth, the Excel histograms and color-coded Word tables provide impressive evidence of assessment efforts. Accrediting bodies, including the Higher Learning Commission, the Association to Advance Collegiate Schools of Business (AACSB), and the Accreditation Board for Engineering and Technology (ABET), have all reacted with high praise to the reports generated by the QCDs and focus groups. These reports, unlike the highly personalized SGID letters, allow systematic longitudinal analysis. The histograms, for example, reveal highly visible changes in satisfaction levels. It is easy to build charts comparing semester-by-semester or year-by-year responses. The color-coded Word tables, placed side by side, quickly reveal whether or not deliberate changes have had the desired effect. If, for example, students one semester identified a poorly written textbook as a course weakness and the textbook was changed, then the color identified with textbook issues would, during follow-up QCDs or focus groups the next semester, ideally drop off the “weakness” chart.
SGIDs, perhaps because they are more labor intensive, do offer some advantages over QCDs. For example, they yield examples and suggestions for changes that might not emerge in the QCD protocol. Because of the extended discussion where the team reports are recorded on the board, the facilitator has more opportunities to engage the students in thoughtful reflection. Most SGID facilitators, for instance, challenge students to refine phrasing so that it is as constructive as possible, a valuable learning experience.
Savvy faculty developers will keep both options in their toolboxes, offering clients with limited time the choice between a QCD and an SGID. Clients who want more in-depth feedback and are willing to invest 50 minutes will benefit from the full-blown focus group, with its transcript of students’ responses to open-ended questions.
CONCLUSION
The focus group protocol is highly versatile. We have used it, for example, to assess several regional and national conferences, including the American Association for Higher Education assessment conference. Our sessions in the last time slots enabled us to model the focus group process while collecting genuine data from conference attendees. Additionally, three faculty focus groups formed the heart of an off-campus consultation to determine faculty perceptions of their professional development opportunities at a small liberal arts university. In this case, faculty “scribes” captured the gist of the open-ended comments, thus eliminating the need for an audiotape and later transcription. Not surprisingly, we learned that focus groups produce rich data from faculty members as well as from students. This versatile protocol can be used with virtually any type of constituency.
The most valuable application of focus groups remains our own use of them for course and program assessment. Because the popularity of this assessment tool has grown steadily, many clients include focus groups in their course syllabi. Many cooperative focus groups conclude with the following open-ended question: “Please comment on the value of this focus group session.” This request invariably draws positive responses from cadets who have been asked to complete numerous assessment instruments, tests, and activities since they first set foot on the Air Force Academy. Without exception, students in these focus groups have indicated that they like the purpose and format of these structured interviews. They enjoy the informal interactions far more than paper-and-pencil surveys or bubble sheets. They are flattered that their opinions matter.
CEE can genuinely assure cadets that their opinions matter. Many positive course changes have resulted from the student input gathered through these cooperative focus groups. For example, as a result of focus groups, the law department made significant changes in their course structure and content, in their textbook, and in their methods of evaluation. A core course melding geography and meteorology adopted new texts and activities that fostered integration. As a result of focus group feedback, faculty are also working to incorporate active learning and cooperative learning approaches into this core course. An experimental first-year course in problem-based engineering added more “scaffolding” for students over a three-year period. A pilot “First-Year Experience” course will undergo major changes in its next iteration, due, in part, to data captured through focus groups with students. The management department made key changes in their core course and in their major (“closing the loop”) prior to their highly successful accreditation review by the AACSB. The list could continue almost indefinitely.
CEE staff members are delighted with the positive results of focus groups and the spin-off QCDs, even though our workload has increased as a result. For more information, including sample reports, please go to www.usafa.af.mil/dfe/.
NOTE
Special thanks go to Lt. Col. (Ret.) Marie Revak, former Director of Academic Assessment, and to Mr. Curtis Hughes, Deputy Director, Academic Assessment, U.S. Air Force Academy, who developed the Excel histogram and Word table formats. Mr. Hughes also provided editorial insights for this chapter.
REFERENCES
- Clark, D., & Redmond, M. (1982). Small group instructional diagnosis: Final report. Washington, DC: Fund for the Improvement of Postsecondary Education. (ERIC Document Reproduction Service No. ED217954)
- Diamond, N. A. (2002). Small group instructional diagnosis: Tapping student perceptions of teaching. In K. H. Gillespie (Ed.), A guide to faculty development: Practical advice, examples, and resources (pp. 82-91). Bolton, MA: Anker.
- Wright, S. P., & Hendershott, A. (1992). In D. H. Wulff & J. D. Nyquist (Eds.), To improve the academy: Vol. 11. Resources for faculty, instructional, and organizational development (pp. 87-104). Stillwater, OK: New Forums Press.
- Wulff, D. H. (1996, Fall). Small group instructional diagnosis (SGID). Training workshops conducted at the U.S. Air Force Academy.
Contact:
Barbara J. Millis
Director of Faculty Development
HQ USAFA/DFE
2354 Fairchild Drive, Suite 4K25
USAF Academy, CO 80840-6220
Voice (719) 333-2549
Fax (719) 333-4255
Email [email protected]
Barbara J. Millis, Director of Faculty Development at the U.S. Air Force Academy, frequently offers workshops at professional conferences (American Association for Higher Education [AAHE], Lilly Teaching Conferences, Association of American Colleges and Universities [AAC&U], etc.) and for various colleges and universities. She publishes articles on a range of faculty development topics, and co-authored with Philip Cottell Cooperative Learning for Higher Education Faculty (Oryx Press [now Greenwood]) and with John Hertel Using Simulations to Promote Learning in Higher Education (Stylus Press). Appearing shortly from Stylus Press will be Using Academic Games to Enhance Learning in Higher Education. Her interests include cooperative learning (see Enhancing Learning—and More!—Through Cooperative Learning: http://www.idea.ksu.edu/papers/Idea_Paper_38.pdf), peer review, academic games, classroom observations, microteaching, classroom assessment/research, critical thinking, how students learn, and writing for learning. After the Association of American Colleges and Universities selected the U.S. Air Force Academy as a Leadership Institution in Undergraduate Education, she began serving as the liaison to the AAC&U’s Greater Expectations Consortium on Quality Education.
APPENDIX 8.1
SOME OPEN QUESTIONS: PHYSICS FOCUS GROUP
Assigned Number:
Please indicate your section number:
1) Why do you think your instructor assigns preflights?
2) How are the preflights used during class time?
3) Do you think that doing the preflights is a good use of your study/preparation time? Please explain.
4) Compared to your other activities, what is your estimate (hours spent or percentage of your time) of time spent with physics?
5) On a scale of 1 to 5 (1 = lowest and 5 = highest) rate your amount of effort in the course:
Compared to other students in your section?   1   2   3   4   5
Compared to your other courses?   1   2   3   4   5
Compared to the amount of effort you might have expended?   1   2   3   4   5
Comments:
6) Do computer network problems impact your work in this course? If so, what percent of the time?
0%-10% 11%-20% 21%-30% 31%-40% 41%-50% More?
Comments:
7) On a scale of 1 to 5 (1 = lowest and 5 = highest) how would you rate the effectiveness of the evaluation methods (quizzes, papers, exams, etc.) used in the course? Please add explanatory comments.
1   2   3   4   5
Comments:
APPENDIX 8.2
SAMPLE FOCUS GROUP QUESTIONS FOR MGT 210
(The Core Management Class Required of all Students)
What were your expectations for Mgt 210? Were they met?
What management skills are important to you? Has Mgt 210 helped you with them?
Are there topics presently covered in Mgt 210 that should be omitted? Please elaborate.
How do the evaluation methods used in Mgt 210 allow you to demonstrate your level of mastery of course material? Please explain.
What changes in the evaluation methods would you recommend? Please explain.
How did the teaching methods used in Mgt 210 help you learn?
In what ways did your instructor contribute to your experience in Mgt 210?
What is the most important thing you’ve learned from Mgt 210?
What do teachers in other courses do that significantly enhance your learning?
APPENDIX 8.3
ROUNDTABLE ACTIVITY # 1
Index card numbers (top right corner): _________________________________________
Passing this sheet of paper rapidly from one person to another, please jot down all of the relevant strengths of the course, saying them aloud as you write.
Working as a team, rank order the strengths you identified, with the most important ones at the top of your list. Rank at least three by writing the numbers “1,” “2,” and “3” next to the strengths you identified.
ROUNDTABLE ACTIVITY # 2
Passing this sheet of paper rapidly from one person to another, please jot down all of the “negatives” of the course—the things you would change—saying them aloud as you write.
Working as a team, rank order the weaknesses you identified, with the most significant ones at the top of your list. Rank at least three by writing the numbers “1,” “2,” and “3” next to the weaknesses you identified.
APPENDIX 8.4
INDEX CARD DATA
APPENDIX 8.5
MGT 210 FOCUS GROUP
November 14, 2001