7 From SGID and GIFT to BBQ: Streamlining Midterm Student Evaluations to Improve Teaching and Learning
Faculty members want feedback about ways to improve learning, and midterm assessments are more useful for this purpose than end-of-term student evaluations. Not all institutions, however, provide faculty development consultants. This chapter presents an innovative process appropriate for institutions currently without teaching enhancement centers. The Bare Bones Questions (BBQ) process consists of empathic, trained colleagues facilitating students’ evaluative discussions. Students and faculty members have been overwhelmingly positive about the process, which has been piloted for the past three years. Students’ suggestions can range from simple changes in the classroom environment to enhanced sensitivity to cultural diversity. BBQ may also build intra-institutional collegiality by reducing the isolation of teaching.
Not all institutions support faculty development centers, but their faculty want to improve student learning. This chapter presents an innovative, colleague-based approach to midterm student evaluations that is available to faculty members without access to trained development specialists. Our innovation, Bare Bones Questions, or BBQ, was developed specifically from faculty development research, including the venerable Small Group Instructional Diagnosis (SGID) technique of the 1980s (Redmond & Clark, 1982) and the Group Instructional Feedback Technique (GIFT) of the 1990s (Angelo & Cross, 1993). The Bare Bones Questions process has been piloted and modified over six long semesters. Making this collegial process viable at an institution may reduce the isolation of college teaching and encourage administrators to fund faculty development centers.
THE VALUE OF MIDTERM CONSULTATION AND STUDENT FEEDBACK
Faculty development literature provides valuable insight into ways of improving teaching and learning. Models include classroom observations with feedback from trained consultants. Teaching, when evaluated for the purpose of enhancement, improves dramatically, as evidenced by scores on student ratings (Brinko & Menges, 1997). Instructional consultation often involves four basic components: the initial contact, a pre-visit conference, information collection in a classroom, and a feedback session in which problems are diagnosed and solutions explored. Teachers report that the benefits of faculty development consultation last for many years.
But what exists for faculty members without access to faculty development experts? Most institutions offer structured end-of-semester student evaluations, but their usefulness for improving teaching and learning is questionable. Many standardized questionnaires purport to measure teaching but apply only a few dimensions using Likert-scale responses. “Teachers have every right to be demoralized by such a simplistic approach—the nuances of teaching cannot possibly be captured this way” (Palmer, 1998, p. 143). A study of student satisfaction with end-of-course student ratings (ECSR) found that students dislike ECSR forms because responses cannot be explained and questions are repetitive (Wulff, Staton-Spicer, Hess, & Nyquist, 1985). Students did not believe ECSR comments were taken seriously or were used to improve teaching. Students were most satisfied with methods that resulted in instructors using their feedback to make changes during a semester, so they could benefit promptly from suggestions. Students also preferred expressing their opinions orally during group interaction. The authors concluded that the most useful data reflect the unique complexity and context of each individual classroom. Other faculty development experts recognize that end-of-term evaluations offer few solutions for instructors and may be denigrated by students since a course is essentially finished by the time students complete their evaluations.
Small Group Instructional Diagnosis
Small Group Instructional Diagnosis was developed by D. Joseph Clark at the University of Washington as an alternative both to end-of-semester student evaluations and to expensive, elaborate, and more time-consuming faculty consultation procedures (Redmond & Clark, 1982). The SGID process consists of five steps. First is an initial conference with a consultant in which instructors explain their teaching style and interests. Next comes the classroom visit. The instructor introduces the consultant and leaves. Singly or in small groups, students answer three questions: 1) What do you like about the course? 2) What do you think needs improvement? 3) What suggestions do you have for bringing about these improvements? After about ten minutes, the consultant brings the whole class together and requests the most important answers. These are written down and given to the instructor. At their next conference, instructor and consultant discuss alternative approaches to the class and what the instructor might say to students at the next class meeting. In step four, the instructor uses about ten minutes to clarify student opinion, respond to student feedback, and summarize intended changes. The final step is a follow-up conference with the consultant later in the semester to discuss the success of the changes. The authors suggest the main advantages of SGID are its effectiveness, the fact that it takes only 30 minutes of class time, and the positive reactions of both instructors and students to the experience.
Coordinators, faculty members, and students rated SGID superior to end-of-semester evaluation questionnaires. SGID students showed a significantly higher level of motivation. “These results . . . indicate that students who had the opportunity to voice their concerns at midterm were more favorably disposed to the instructor’s efforts, with a resulting change in their motivational output toward the end of the course” (Redmond & Clark, 1982, p. 10). SGID was endorsed as the principal method of formative classroom assessment even for mature, tenured faculty (Bennett, 1987). It was used for fine-tuning teaching, assessing textbook and instructional changes, investigating problems, and improving classroom climate. The instructor’s return session with students is a unique teaching opportunity for now very receptive students. Students, accustomed to receiving rather than contributing, report that “. . . their appreciation is profound when dialogue places them in an active role” (Bennett, 1987, p. 103).
Consultants at the University of Michigan’s Center for Research on Learning and Teaching (CRLT) added “collecting small-group feedback from students” to their existing consulting process (Black, 1998). Elaborate training sessions for consultant-facilitators include reading, practicing data recording and giving feedback, videotaping, and role-playing. SGID is used for consulting with teaching assistants, new faculty members, experienced faculty who are developing new courses or encountering a difficult group of students, departmental development programs, and any faculty members who want to improve teaching. According to Black, a disadvantage is that SGID takes a great deal of time, at least four hours for both consultant and instructor, but all who use SGID learn an incredible amount about teaching.
Group Instructional Feedback
In higher education literature, an SGID-like process, titled the Group Instructional Feedback Technique, appears as one of 50 classroom assessment techniques (Angelo & Cross, 1993). GIFT is referenced in a section on “Techniques for Assessing Learner Reactions to Instruction.” Teacher evaluations are used for reappointments, tenure, and promotion, but few help faculty improve their teaching. GIFT was designed to “capitalize on the ability of groups to give more comprehensive and useful feedback than individuals” (Angelo & Cross, 1993, p. 321). Student feedback is a GIFT in two senses of the word: instructors get organized data filtered through a detached but sympathetic information gatherer. GIFT differs only slightly from SGID. A visiting assessor asks previously agreed-upon, precisely worded questions. Students take three to four minutes to write answers alone and another three to four minutes to compare answers with others. Common responses are put on the board, and students are asked to raise hands to indicate percentage agreement. In terms of cost in time and energy, the authors rank GIFT “medium” for faculty preparation and student response, and “medium to high” for faculty analysis of collected data. Modified questions are:
What aspects of the course assist you in learning? What aspects of the course environment hinder your learning? Give some suggestions to improve the learning environment of the course. Has this course been what you expected when you signed up for it? Explain. (Santanello & Eder, 2001, p. 6)
The authors emphasize a pre-classroom visit so a colleague will know as much as possible about a course in advance, including items the “host professor” wants investigated and what areas they do not want discussed. The feedback session is “a sharing of pedagogical techniques and experiences to be followed by a thoughtful and scholarly written report to the host professor” (Santanello & Eder, 2001, p. 7).
DEVELOPMENT AND PILOTING OF BBQ
Our interest in midterm student evaluations came from a faculty-initiated discussion group called Learning Innovators. BBQ is both faculty directed and faculty administered. Our provost recently began funding the formation of a Teaching-Learning Enhancement Center. His support is partially due to faculty members’ persevering efforts to improve teaching and learning. BBQ was developed because we were interested in improving teaching but lacked access to consultants. Our first innovation was making the process collegial; we had to rely on each other rather than on faculty development experts. Our work somewhat resembles peer partner programs (Morrison, 1997).
After reviewing faculty development research, we drew the most salient parts of SGID and GIFT and shaped them to fit our faculty and institution. Our primary model, SGID, was developed to shorten the time consultants spent arriving at solutions for faculty members (Redmond & Clark, 1982). Time is an issue for all faculty members, as well as for consultants. BBQ is designed to provide maximum amounts of valid and useful information for faculty at the least possible cost in time, since most who teach feel overwhelmed by expectations for preparing and teaching classes, attending meetings, performing service, conducting research, writing, and publishing.
A second innovation involved streamlining the process. The first fall we followed a traditional SGID model, including the pre-visit conference focusing on “host” faculty members’ concerns. This took at least 30 minutes. We arrived at the beginning of a three-hour class and were introduced as a fellow teacher who would observe the class and then facilitate a group evaluation at the end. We observed and wrote down observations until 30 minutes remained in the class period. The instructor left the room and SGID began. Students were divided into small groups of three to four, and each group was given a sheet of paper with three questions and about three inches of space for answers below each question. Each group answered the following questions: What does this instructor do in this class that helps you learn? What hinders your learning in this class? What are one or two specific suggestions of ways to improve your learning in this class? After about 15 minutes, when the conversational buzz was almost gone, we took the entire class through all three questions again, asking each group to report what was most important. Group answers were written on the blackboard, and class consensus was confirmed on each statement. A student copied answers off the board as discussion was facilitated. Students were thanked for their input, groups’ written answers were gathered up, and the class was dismissed. The typing of observations and student feedback took at least four hours. Additionally, there was the time for the follow-up visit with colleagues to relate students’ answers. The host instructor was always delighted with very useful student input, but the process took facilitating colleagues at least eight hours, or the equivalent of one full day of work. Clearly, this process was too time-consuming for even the most accommodating and empathic colleague to take on in addition to existing teaching duties.
To save time, the following spring we eliminated the pre-visit conference between instructor and colleague-facilitator and asked the same three questions in every class. We shortened classroom observation and scripting to an hour, but the process was still too costly in time. In the next iteration, classroom observation was eliminated and we simply facilitated student discussions in the last 30 minutes of class after instructors left the room. This was doable from a busy faculty member’s perspective and still produced rich, useful data. Students continued to be overwhelmingly positive about the process and even asked us to come into other classes. We explained that we come only at the invitation of faculty members. Students are awed by the fact that their instructors are interested in student suggestions. We named the process BBQ for Bare Bones Questions. It is too attenuated to be a true SGID but is more organized and collegial than GIFT. We demonstrated our innovation at two teaching conferences that summer and fall.
Demand for BBQ services soon outstripped time available since we all carry full-time teaching loads. This led to yet another innovation we call “collegial training.” Training consists of three parts: 1) A faculty member observes the BBQ process facilitated in a colleague’s classroom with permission of the colleague, 2) BBQ is carried out in the new colleague’s class by an experienced facilitator, and 3) the “new” colleague then facilitates BBQ in another colleague’s classroom. In one and one-half hours, instructors learn about the process through observation, by receiving information about their own classes, and by facilitating BBQ for a colleague. A new BBQ facilitator colleague can now be added to the list. The second fall we accomplished 13 BBQs, including both graduate and undergraduate classes. We covered all teaching time slots—morning, afternoon, and night. Each semester more faculty members become involved. BBQ is truly a bare bones question process since we have pared down the time faculty spend, but they can still discover what students believe will improve their learning. Information comes in time for faculty members to make course changes in response to student suggestions. We are not trained faculty development consultants; we are colleagues trying to help each other improve student learning. Our recommendations for implementing BBQ can benefit faculty members at institutions lacking faculty development centers.
RECOMMENDATIONS FOR IMPLEMENTATION OF BBQ
The Bare Bones Questions process, or BBQ, is most useful when it takes place about midterm, while there is still time to make changes in a course. BBQ should take place after a major grade is given to students, for example, about two weeks after a major test. If colleagues hear about BBQ but are not sure about having it done, we encourage them to observe the process in a colleague’s class with that colleague’s permission. Coordination between “host instructor” and “colleague facilitator” is accomplished quickly by phone or email. There is no preliminary consultation since all classes are asked the same three questions. We usually facilitate BBQ during the last 30 minutes of a class period. An exception is night classes held from 7:00 to 9:50. It is too much to ask a colleague to come at 9:20, but we found that starting at the beginning of class meant delays due to the late arrival of some students. Our current night class solution is to facilitate BBQ during the last 30 minutes of the first half of class, before the break. In three-hour classes that break comes about halfway through the class period. We agree with Tiberius (1997) that teachers should not perform evaluations in their own classes, although Angelo and Cross (1993) propose this as a last resort. An alternative suggestion is having the evaluation done by a committee of students. For us, the idea of doing it ourselves or using student committees raises red flags about the validity of what students will say.
We recommend instructors tell students at the beginning of class that a colleague is coming to discover their ideas about ways to improve learning. When facilitating colleagues appear, they are introduced and the instructor leaves. To encourage student comfort, we tell a little about ourselves, such as the courses we teach, about Learning Innovators, and our own interest in improving learning. We acknowledge that students are accustomed to end-of-semester evaluations, but the purpose of BBQ is to identify improvements while the class is ongoing. We praise the instructor for wanting their students’ opinions. We read the three questions and explain that they will be discussed first in small groups and then as a whole class. We assure students that 1) this is being done because their instructor cares about students and wants to improve their learning, 2) what students say is confidential, will be summarized for their instructor, and no student will be identified, 3) their instructor is the only person who will see student comments, and 4) student information will be aggregated and presented to the instructor as coming from the entire class, which is why agreement is so important. We find putting group answers on transparencies is more useful than blackboards or flip charts. Back in our offices we type students’ opinions right off of the transparencies. Students watch to see that we print exactly what they say. If a group reports something another group has already said, we put check marks next to that answer. In our report to instructors, we might note that “five groups mentioned this helps.”
Some experts argue in favor of facilitation by colleagues in the same discipline to better understand what students say, but we disagree for the sake of validity. To ensure truthful student answers we believe colleague pairings should be across disciplines or schools. Central to validity is anonymity for students who may be suspicious of facilitating faculty members who know them from other classes. Some students are afraid a facilitator may remember who said what and tell an instructor. In BBQ, a colleague-facilitator is only required to be an intelligent and honest transmitter of information. We have tried it both ways and believe validity issues outweigh other concerns.
In the literature, a 24-hour turnaround is suggested, but we recommend that, at a minimum, feedback be given to instructors before the next class meeting. This gives instructors time to organize their response to student suggestions and think about possible changes. In the literature, opinions vary as to the form feedback should take. We suggest it be typed and handed to an instructor at a face-to-face meeting. Interacting face-to-face allows the receiving colleague an opportunity to ask questions about the report. We tried saving time by eliminating this conference and using email or campus mail, but were dissatisfied. We find student responses always generate questions from faculty, and clarification is best done face-to-face. Our current approach is for facilitating colleagues to make appointments with instructors, go to their offices, and explain they are empathic colleagues and NOT trained faculty development consultants. Instructors then read students’ feedback. The facilitating colleague may inquire whether students’ comments ring true or make sense. Instructors can ask for clarification on items. This feedback session gives both instructors and facilitating colleagues a sense of accomplishment and closure. At their next class meeting instructors thank students for their input, summarize it, ensure accuracy, and discuss possible course changes for the remainder of the semester. This, in turn, also gives students a sense of acknowledgement and closure.
DECORUM AND ETHICS IN COLLEGIAL FEEDBACK
Based on experience, we believe strongly that BBQ colleagues should act as empathic peers in the truest sense. Within a collegial pair neither person is the “expert.” A facilitating colleague is not a consultant but a conduit of student statements. If expertise is assumed to reside in the facilitating colleague, then the receiving colleague may perceive student dissatisfactions as valid simply because they are accompanied by the facilitator’s advice. Palmer (1998) proposes respecting each other’s vulnerability, although it is difficult to resist the temptation to make suggestions. Norms in academia lead us to believe “. . . we were put on earth to advise, fix and save each other, and whenever an opportunity to do so presents itself, we should seize it!” (Palmer, 1998, p. 151). Our BBQ volunteer and paired colleagues report being reassured by sharing student comments because results dramatically demonstrate commonalities in student suggestions regardless of discipline.
A major emphasis is that instructors read and understand exactly what their students said. Students’ ideas, accurately conveyed, enhance the richness and value of the data. By following this method of operation for several semesters, we have generated neither resentment nor hostility among cooperating colleagues, since we are equals, or peers, helping each other. Lenze (1997) believes a meaningful, nonthreatening feedback session is crucial because the receiving instructor is vulnerable; her experience is that most instructors react favorably because SGID is “. . . a concrete, confirming, constructive and thoughtful” (p. 146) experience. After answering any questions about students’ remarks, an appropriate closing statement is, “Thanks for inviting me to your class, I enjoyed working with you. If you have any questions or comments about the feedback feel free to let me know” (Border, 1997, p. 24).
Ethics in consulting should be applied in this peer-based process (POD Network, 2001). Trust is an essential component between colleagues who pair up for BBQ. The instructor must feel safe and be assured that what students say will be held in confidence by the facilitating colleague. As is true for consultants, BBQ results should not be discussed with any person other than the teaching colleague. “No information should be given to a teacher’s supervisors, to other teachers, nor should it be used in any written correspondence about a teacher” (Border, 1997, p. 19).
A related issue is that instructors should be able to trust that feedback relayed by a colleague is truthfully what students said. Wilbee (1997) suggests that only helpful feedback should be given about things that a teacher can actually change, and Lenze (1997) suggests altering the feedback report if students’ comments are more critical than positive in nature. We disagree with the idea of filtering negative student comments. We recommend all feedback, representing class consensus, be included. Receiving colleagues must believe facilitators are honest in their reports, neither exaggerating nor enhancing student comments with their own “spin.”
We also agree that the most useful feedback is based on teacher behaviors, reflects students’ needs, and suggests ways to improve learning. Student input about teaching behavior is central to instructional improvement. For example, a common complaint is that teachers do not leave slides up long enough for students to copy what they think is important. Useful feedback is, “It would help learning if a copy of transparencies or slides could be put on reserve in the library or posted on the web.” Many changes are fairly simple. For example, an instructor might assume that providing detailed outlines of each class lecture/discussion is helpful. However, some students are accustomed to lectures based solely on textbooks and may wonder how the numbered outlines relate to the textbook chapter numbers. Implementing BBQ can bring this confusion to light and result in a change in the course, such as typing corresponding chapter numbers at the top of lecture/discussion outlines or writing “This is NOT in the textbook” or “Some of this material is found at the beginning of Chapter 5.” After BBQ, one computer engineering instructor eliminated the last three required class projects, saying he agreed with students that it was too much work for one semester. Another faculty member did not realize how distracting hallway noise was to students since her back was to the doorway. Thanks to BBQ, this situation was brought to her attention and now she closes the door when she begins teaching.
BBQ BENEFITS ALL STAKEHOLDERS
When learning is improved, there is a win-win-win situation. Faculty members, students, and institutions all reap the rewards. Personal benefits for faculty members occur when our own classes are visited, but we also learn by hearing from colleagues’ students. Through round-robin collegial training, more and more faculty members are drawn into the practice of listening to students and improving courses accordingly. Faculty members actually have a minimum of two opportunities to hear from peers’ students about what is helpful and what hinders. After these two experiences, faculty members understand the validity of data generated in their own classrooms. Observation in other teachers’ classrooms also takes much of the “sting” or “pain” out of what could be an ego-bruising experience. Both experienced and fledgling teachers report being enlightened by students’ comments. There is no other way faculty can get this information. Faculty members worried about colleagues hearing student comments that might reflect badly on their teaching are relieved of this concern after hearing from students in colleagues’ classes.
When midterm evaluations are peer facilitated another benefit occurs for faculty. One challenge to postsecondary teaching is what Palmer (1998) refers to as the isolation of teaching.
Academic culture builds barriers between colleagues even higher and wider than those between us and our students. These barriers come partly from the competition that keeps us fragmented by fear. But they also come from the fact that teaching is perhaps the most privatized of all the public professions. (p. 142)
Palmer writes that although there are no formulas for good teaching it may help to talk to fellow teachers in a community of pedagogical discourse. BBQ faculty members, as colleagues, feel less isolated. Since it is voluntary, BBQ may also ameliorate faculty fears about both peer review and student evaluations. By increasing collegiality and improving student learning and attitudes, BBQ resolves several potentially negative situations facing faculty members.
Midterm assessments benefit students in several ways. Angelo and Cross (1993) believe GIFT helps students develop an ability to draw inferences, evaluate teaching methods and materials, work productively with others, and cultivate a commitment to honesty. When students think about their own learning processes, learning improves. Tiberius (1997) describes SGID as “. . . a combination of a leaderless small group discussion followed by an unstructured, whole class interview . . .” (p. 60). Since student groupings are leaderless, there is less inhibition about speaking frankly. Students are flattered by teachers’ interest in their opinions, especially if teachers take some of their suggestions. With BBQ, teachers find out how students believe learning could be improved. Truthfulness is improved by having students speak after the faculty member has left the classroom. Black (1998) cautions that if an instructor ignores students’ suggestions then class atmosphere could be harmed and negative comments could snowball as students share frustrations. As an alternative, she suggests individual responses followed by a time-consuming analysis of patterns by consultants. We have not found this to be a problem. Our students are very vocal. We always emphasize the importance of whole class consensus and assure students they do not have to find something wrong with a class. We do ask for each group’s most important responses, and we complete the first question before moving on to the next. We also alternate the order in which groups speak. Our students are comfortable saying “We disagree!” when other groups make their statements. In a few classes the students appear to be evenly divided on what would help. In such cases, we note on our report that there was disagreement.
CONCLUSION
BBQ is truly a bare bones process, paring down the time spent by faculty interested in knowing what their students think at midterm. We began using the traditional five-step SGID, including observation and scripting of colleagues’ three-hour classes followed by students’ discussions. The process, including preparing the report, took about eight hours. This was too much to ask of even our most empathic colleagues. We streamlined BBQ over several semesters. While time is saved, neither informational quality nor validity is lost. BBQ offers many of the benefits of SGID, with the exception of advice from faculty development experts. By using teaching peers, BBQ saves time and money. It eliminates the pre-classroom conference and shortens the post-classroom session. It simplifies midterm student evaluations by making them a colleague-to-colleague service. By interacting with students from colleagues’ classrooms, faculty members gain a new competency, as well as a different perspective. Students benefit by hearing whether or not their opinions are similar to or different from others’ opinions. In this way students “. . . begin to understand what sort of a challenge confronts an instructor trying to provide worthwhile learning experiences for students who have different learning needs and expectations” (Weimer, 1990, p. 107). Students also learn to express constructive criticism of teaching and better understand their experiences in other classrooms. BBQ is a valid way to discover what students believe will improve their learning and may even be an option for faculty development experts overwhelmed with demands for their services. For thousands of institutions without consultants, BBQ is a viable alternative with significant benefits for all participants.
NOTE
BBQ was first presented at the 2001 Texas Lilly Conference on College and University Teaching at Southwest Texas State University in San Marcos. Based on new research findings, a revised format and an interactive session were presented at the 9th Annual Meeting of the Southwestern Business Administration Teaching Conference at Texas Southern University later that same year.
REFERENCES
- Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques. San Francisco, CA: Jossey-Bass.
- Bennett, W. E. (1987). Small group instructional diagnosis: A dialogic approach to instructional improvement for tenured faculty. The Journal of Staff, Program, and Organization Development, 5, 100–104.
- Black, B. (1998). Using the SGID method for a variety of purposes. In M. Kaplan & D. Lieberman (Eds.), To improve the academy: Vol. 17. Resources for faculty, instructional, and organizational development (pp. 245–262). Stillwater, OK: New Forums Press.
- Border, L. L. B. (1997). The creative art of effective consultation. In K. T. Brinko & R. J. Menges (Eds.), Practically speaking: A sourcebook for instructional consultants in higher education (pp. 17–24). Stillwater, OK: New Forums Press.
- Brinko, K. T., & Menges, R. J. (1997). Practically speaking: A sourcebook for instructional consultants in higher education. Stillwater, OK: New Forums Press.
- Lenze, L. F. (1997). Small group instructional diagnosis (SGID). In K. T. Brinko & R. J. Menges (Eds.), Practically speaking: A sourcebook for instructional consultants in higher education (pp. 143–146). Stillwater, OK: New Forums Press.
- Morrison, D. E. (1997). Overview of instructional consultation in North America. In K. T. Brinko & R. J. Menges (Eds.), Practically speaking: A sourcebook for instructional consultants in higher education (pp. 121–129). Stillwater, OK: New Forums Press.
- Palmer, P. J. (1998). The courage to teach. San Francisco, CA: Jossey-Bass.
- POD Network. (2001). Ethical guidelines for educational developers. In D. Lieberman & C. Wehlburg (Eds.), To improve the academy: Vol. 19. Resources for faculty, instructional, and organizational development (pp. xvii–xxiii). Bolton, MA: Anker.
- Redmond, M. V., & Clark, D. J. (1982). Student group instructional diagnosis: A practical approach to improving teaching. AAHE Bulletin, 34, 8–10.
- Santanello, C., & Eder, D. (2001). Classroom assessment techniques. Thriving in Academe, 19, 5–8.
- Tiberius, R. (1997). Small group methods for collecting information from students. In K. T. Brinko & R. J. Menges (Eds.), Practically speaking: A sourcebook for instructional consultants in higher education (pp. 53–63). Stillwater, OK: New Forums Press.
- Weimer, M. (1990). Improving college teaching: Strategies for developing instructional effectiveness. San Francisco, CA: Jossey-Bass.
- Wilbee, J. (1997). Instructional skills workshop program: A peer-based model for the improvement of teaching and learning. In K. T. Brinko & R. J. Menges (Eds.), Practically speaking: A sourcebook for instructional consultants in higher education (pp. 147–156). Stillwater, OK: New Forums Press.
- Wulff, D. H., Staton-Spicer, A. Q., Hess, C. W., & Nyquist, J. D. (1985). The student perspective on evaluating teaching effectiveness. Association for Communication Administration Bulletin, 53, 39–47.
Contact:
Margaret K. Snooks
School of Human Sciences and Humanities
University of Houston-Clear Lake
Houston, TX 77058-1098
Voice (281) 283-3381
Fax (281) 283-3408
Email [email protected]
Sue E. Neeley
School of Business and Professional Administration
University of Houston-Clear Lake
Houston, TX 77058-1098
Voice (281) 283-3219
Fax (281) 283-3951
Email [email protected]
Kathleen M. Williamson
School of Business and Professional Administration
University of Houston-Clear Lake
Houston, TX 77058-1098
Voice (281) 283-3192
Fax (281) 226-7317
Email [email protected]
Margaret K. Snooks has been a faculty member in the School of Human Sciences and Humanities at the University of Houston-Clear Lake (UHCL) since 1991. She was recently appointed co-convener of the developing UHCL Teaching Learning Enhancement Center. Her research interests include ways to improve teaching and learning at the postsecondary levels of education. She teaches undergraduate and graduate courses in the Program of Fitness and Human Performance, including courses in health psychology and women’s health. In 2002 she published in Health Care for Women International and in Women in Higher Education: Empowering Change.
Sue E. Neeley is an Associate Professor and Coordinator of the Marketing Program at the University of Houston-Clear Lake. Her research interests include marketing strategy (particularly the relationship between market share and profitability), business-to-business customer relationship management, and the scholarship of the enhancement of teaching and learning. She has conducted numerous workshops on innovative methods of using mid-semester student feedback to improve learning and student involvement in the learning process. Her research has been published in the Journal of Business Research, Journal of Services Marketing, Mid-Atlantic Journal of Business, Journal of Business Education, and The Handbook of Business Strategy, as well as the proceedings of professional conferences.
Kathleen M. Williamson has been a faculty member in the Marketing Department of the School of Business and Public Administration at the University of Houston-Clear Lake since 1998. She teaches Principles of Marketing, Marketing Information, Integrated Marketing Communications, and E-Marketing Management to undergraduate and graduate students. Her research interests include Internet marketing, customer relationship management, and the scholarship of teaching and learning.