As online education grows, so does the need for professional development for faculty teaching online courses. This chapter explains how a faculty development technique, small group individual diagnosis, was adapted to the online environment. This technique provides faculty with anonymous midsemester feedback from students regarding the quality of the course. The timing and nature of this feedback are the impetus for a teaching consultation with a trained facilitator. In this study, surprising challenges emerged when the technique was adapted to the online environment. Faculty perceptions of the success of the online diagnoses are shared.

Over the past several years, online education has been growing rapidly in higher education. According to the results of a 2010 national survey administered by the Sloan Consortium and the Babson Survey Research Group of over twenty-five hundred accredited higher education institutions in the United States, "almost 4.6 million students were taking at least one online course during the fall 2008 term; a nearly 17 percent increase over the number reported the previous year." This means that "more than one in four higher education students now take at least one course online" (Allen & Seaman, 2010, p. 1). Palloff and Pratt (2003) report that according to the National Center for Education Statistics, about 60 percent of students take an online course during their college career.

With at least one-fourth of all American college students enrolled in an online course, and with these numbers continuing to increase, the need for faculty development supporting online course design and delivery is greater than ever before. However, "nearly one-fifth (19 percent) of all institutions do not provide any training (even informal mentoring) for their faculty teaching online courses" (Allen & Seaman, 2010, p. 11). Even among those institutions that provide professional development support for faculty making the transition to the online environment, over 30 percent of faculty surveyed perceived their institution’s support for online course development as "below average" (Seaman, 2009).

With a lack of professional development for online instruction at some institutions and with poor faculty perceptions of the quality of existing programs at others, there is clearly a need to investigate how effectively existing faculty development techniques can be adapted for online instruction.

Small Group Individual Diagnosis: Development and Previous Evidence of Success

Small group individual diagnosis (SGID) is a faculty development technique designed to provide instructors with midsemester feedback from students about the quality of their instruction. The SGID process, originally developed by D. Joseph Clark at the University of Washington’s Biology Learning Resource Center, was adapted from Melnik and Allen’s clinical model at the University of Massachusetts (Clark & Bekey, 1979; Bergquist & Phillips, 1977). Since its inception, the SGID process has been known by a number of names, including "group interviews" (Braskamp, Ory, & Pieper, 1981). It has been altered to meet the needs of specific institutional contexts, but the basic process of the technique typically contains several common elements.

During an SGID, a facilitator elicits feedback directly from students in the classroom. Typically conducted about halfway through the semester, this process is designed to help faculty strengthen their teaching during the remainder of the semester by providing them with anonymous, candid feedback on the course’s strengths and weaknesses. This carefully constructed process consists of several steps. First, the instructor meets with the facilitator to review the SGID process and schedules about thirty minutes of class time in which the SGID will take place. Next, on the day of the SGID, the instructor introduces the facilitator to the class, explains the purpose of the process, and leaves the room. The facilitator then divides the students into small groups; each group of students must come to a consensus about what they like about a course, what they do not like, and what suggestions they have for improving it (Coffman, 1998; Wulff, Staton-Spicer, Hess, & Nyquist, 1985). This information is then shared with the other students and with the facilitator. In some cases, the facilitator initiates a discussion about the results and asks the entire class to come to a consensus about the course’s strengths and weaknesses. After the session, the facilitator meets with the instructor in an individual teaching consultation to discuss the results as well as ways to improve the course. Finally, the instructor responds to the students’ feedback during the next class session (Wulff et al., 1985).

Some studies have been conducted on the effectiveness of SGID as a classroom-based evaluation technique. Seldin (1988) found that the SGID process opened the lines of communication between faculty and students. Creed (1997) reported that students who participated in the process felt that their voices had been heard, that the process brings back into the group those with extremely divergent views, and that "because an SGID provides more reflective feedback, the information is qualitatively different than that gotten in end-of-the-semester ratings." Seldin (1993) suggests that if course evaluations are to be used to improve instruction, they should be given within a semester so that instructors have a chance to adjust their teaching.

One of the strengths of SGID is the follow-up teaching consultation to discuss the results of the process. According to McKeachie and Kaplan (1996), consultation with an expert is an important factor in determining the amount of improvement that a teacher makes. Redmond (1982) explains several other factors of the SGID process that have an impact on faculty development. Since the instructor receives feedback from students around midterm, students have a genuine opportunity to evaluate the effect of their feedback and the receptivity and responsiveness of the instructor. Chen and Hoshower (2003) found that students generally consider the improvement of course content and format, and the resulting improvement in teaching, to be among the most important outcomes of teaching evaluations. The major benefit SGID provides to students is the opportunity to compare views, which is not possible with traditional paper-and-pencil evaluations at the end of the semester (Redmond, 1982). Also, students participating in an SGID can assert their own priorities and values, as well as provide constructive suggestions on how to handle current problems (Schein & Bennis, 1965). Finally, small group research (Tubbs, 1997) generally supports the contention that using small group discussion for organizational decision making results in more active acceptance of changes (Redmond, 1982).

Impetus for the Study

Because SGID has been an effective technique to improve instruction, as demonstrated through the literature and our own experiences in implementing this technique in traditional classroom-based courses, there was a growing interest in, and need for, adapting it to online education. Also, since faculty prefer individual consultations with faculty development staff as the mode of training for learning to teach online (Kinuthia, 2005; Carroll, 1993; Gilbert, 1995), there was reason to expect that the individual consultations in the online SGID process would also be perceived positively by faculty. The purpose of this study was to evaluate faculty members’ perceptions of the effectiveness of SGID as a development tool when adapted to online instruction.

Only one other study has analyzed the use of SGID when modified for distance or online education. Sherry and Burke (1995) evaluated the effectiveness of interactive television courses, which were facilitated by video and two-way audio, through online computer-mediated communication. Their results supported the success of the technique’s translation to the online forum, citing meaningful communication from all participants. The present study builds on Sherry and Burke’s findings by investigating the application of SGID to contemporary, Internet-based courses that use course management software and other technologies that were not available during the 1995 study.

However, Millis and Vasquez (2010-2011) contend that the quick course diagnosis (QCD) is more efficient and effective than the SGID. Although the tools are quite similar, the use of SGID in online courses blends elements of each. According to Millis and Vasquez, conducting the SGID in Blackboard’s virtual classroom is similar to the QCD’s collaborative brainstorming, or roundtable, data collection. Instant message chat supports rapid-fire comments that further idea development within the group. The transcribed activity is then saved to a file, avoiding the transcription errors that may occur when documenting the SGID in the traditional classroom. Furthermore, Millis and Vasquez argue that SGIDs take up too much class time. One of the major reasons students enroll in an online course is the ability to complete assignments asynchronously (Richardson & Swan, 2003; Drennan, Kennedy, & Pisarski, 2005; Watson & Rutledge, 2005). Conducting the SGID online avoids this disruption because there are no set class times.

Framework and Methodology

In her 2004 article, "Using a Framework to Engage Faculty in Instructional Technologies," Nancy Chism outlines a conceptual framework that faculty developers can use to "better estimate the potential effectiveness of various strategies" (p. 39) for teaching faculty how to use technology effectively for instruction. Chism’s framework describes faculty learning as a cycle consisting of four repeating steps: planning, acting, observing, and reflecting on their teaching. Two components of her framework are key to evaluating the potential effectiveness of a faculty development program. First, Chism explains that transformative experiences within this cycle tend to occur when faculty recognize a problematic situation. In other words, when faculty realize that an aspect of their teaching is not working effectively, this is often the catalyst for change. Next, Chism recommends specific developmental approaches for each of the four phases of this cycle: if a faculty development program addresses the specific needs of the faculty in each of these phases, the program is more likely to be successful. This framework is used below to analyze whether the application of SGID to online classes has the potential to be effective.

The study began by recruiting and preparing SGID facilitators as well as faculty participants. Facilitators were selected based on two criteria. First, they had to have experience with online education, or at least a strong familiarity with research related to online education; for this reason, members of an institution-wide faculty learning community focused on online education, all of whom were familiar with the literature on online learning and demonstrated proficiency with educational technologies, were invited to apply. Second, a facilitator could not be from the same college as the participant, a restriction intended to prevent possible bias, ensure confidentiality, and avoid potential intradepartmental conflicts. Two facilitators were ultimately selected: one was a staff member in the institution’s faculty development office; the other was a reference librarian, a member of the faculty learning community with expertise in technology and facilitation.

Facilitators were provided with an informational packet that included detailed directions concerning their role in the process. The facilitators collaborated during all phases of the project to ensure that the process was consistent. Next, the sixteen instructors who were teaching online courses during the upcoming semester were invited to participate in the project; seven of the sixteen volunteered. The instructors who chose to participate selected one of their classes for the online SGID and asked the students to participate; participation was not mandatory or tied to the students’ course grades. The courses in the study encompassed five disciplines (education, philosophy, business, English, and computer science) and both undergraduate and graduate students.

Data were collected from participating faculty members through surveys and semistructured interviews. First, instructors completed an online needs assessment survey about their previous experience with online instruction, their current online course, their expectations for the SGID process, and their future goals and needs for professional growth related to online education. They were then matched with a facilitator, with whom they met to review the process and select a date for the SGID to take place. Instructions were sent to students, who were told to log on to the course management system’s online chat feature at a designated day and time. In many cases, multiple sessions were arranged in order to accommodate conflicts with student schedules. During the online session, the facilitator asked students to discuss the three most positive aspects of the course, as well as the three aspects of the course that needed the most improvement. The facilitator summarized the results of the chat, eliminating student names and identifying information to ensure anonymity. After the SGID, faculty reviewed these results with the SGID facilitator in a teaching consultation. The facilitator then interviewed the faculty member about the SGID process; this interview was recorded on video.

The surveys and interviews gathered information about faculty participants’ previous experience with online instruction and their challenges and expectations related to online instruction. These data provided valuable contextual information that assisted facilitators in both conducting the SGID and assisting the faculty member during the subsequent teaching consultations.

Participants

All participants completed an online background survey that provided valuable information about their experiences with online instruction. Five of the participating instructors were full time, and two were part time. Among the full-time faculty, all were tenure track; three were posttenure and two were pretenure. Participants varied in their previous experience with online instruction. (See Table 15.1.)

Table 15.1 Profile of Participating Instructors.
Participant Number | Discipline | Number of Courses Previously Taught Online | Rank
1 | Education | 10 | Associate professor (full time)
2 | Education | 4 | Associate professor (full time)
3 | Commerce | 3 | Assistant professor (full time)
4 | Education | 3 | Lecturer (part time)
5 | Computer and information sciences | 2 | Professor (full time)
6 | English | 2 | Assistant professor (full time)
7 | Philosophy | 1 | Lecturer (part time)

The background survey revealed that one instructor had taught a blended course, which includes at least one face-to-face class meeting; the others taught completely online, asynchronous courses. Three participants had previous professional development related to online instruction. The average class size was twenty-six students, and only three instructors had used Blackboard’s chat feature, the tool used to conduct the SGID. Six instructors indicated that they were participating in the SGID because they were hoping to improve or refine their current course, two were planning to increase the number of online courses they were teaching, and four indicated that they were interested in continued professional development for online instruction.

Results

The purpose of this study was to evaluate faculty members’ perceptions of the effectiveness of SGID as a faculty development tool when adapted to online instruction. The researchers identified five indicators to measure whether the use of SGID was effective and perceived positively by faculty participants:

  1. The faculty member indicated that he or she personally benefited from the experience or felt that the experience would be helpful to others.

  2. The faculty member found the feedback to be helpful for understanding either students’ learning or whether the current course design was successful.

  3. The faculty member was able to plan course improvements based on data gathered during the SGID.

  4. The faculty member identified the need for or expressed interest in additional professional development as a result of participation in the SGID.

  5. The faculty member felt that taking the time to participate in an SGID was beneficial; he or she would participate in the process again in a future class.

These indicators were identified through the researchers’ perceptions of markers of a successful faculty development program; these perceptions were founded in experience with faculty development, as well as locally identified benchmarks at their institution.

The interview results below are presented by success indicator. Faculty interviews, which were conducted after the SGIDs took place, demonstrate that participating faculty perceived SGID, when applied to the online environment, to be a successful and useful tool for professional development and a beneficial vehicle for student feedback. Faculty responses for each of the success indicators appear below; responses were generally evenly distributed among the participants.

  1. The faculty member perceived that he or she personally benefited from the experience or felt that the experience would be helpful to others. Faculty members found the SGID to be "extremely helpful." One participant explained, "What I expected, what I hoped for, was to get some real evaluation back from my students. Students are afraid to tell a professor exactly what they want." Many noted that the SGID was a vehicle to give them timely, accurate, and in-depth feedback about the course. One participant explained,

[SGID] is the only feedback that [we] get other than the course evaluations. And with online classes, it’s an online course evaluation; therefore, they can opt to do it or not. It’s not like there’s a piece of paper in front of you on the desk and someone is waiting to take it away. In a lot of cases, what I have found is that everybody who hated the class complained in the evaluations. At least in this case, you have pros and cons. Something like this is very helpful for an online class. It gives a more balanced perspective to the instructor.

Faculty also explained that they benefited because they learned more about their students as online learners. One faculty participant explained, "I guess [SGID] gives me a different view of how students work." Another explained that "many of the students were not familiar with online learning" and were "reticent in terms of what to expect" from the online course. She had "taken for granted" that the students were familiar with the required technology. Another participant felt that the process made her "more aware of how to approach online learning" and recommended that instructors "who are not really sure for whatever reason how their students are feeling about the course" participate in the process.

Instructors also felt that they benefited from SGID because it validated their current practices or bolstered their confidence. One faculty member remarked, "It really told me that I’m on the right path." Another person said, "Before, I was fumbling in the dark. I think the feedback is useful, especially for online classes, because we don’t get feedback otherwise. Although online teaching is so convenient and effective, I need positive reinforcement too." Another professor explained, "At the end, I got a few indications on their reflection papers that they really liked the papers rather than just the text. Other than that, I had no clue. I haven’t seen the official evaluations on the course. I think it went okay, everybody seemed happy, but this is nice to know that you interfaced with them, and they were saying positive things."

  2. The faculty member found the feedback to be helpful for understanding either students’ learning or whether the current course design was successful. Each of the participants found student feedback to be helpful because it "validated the remarks that students made" in previous semesters’ course evaluations. One person explained that "there were some consistent remarks that I sort of suspected but that are confirmed." Another faculty participant felt that "there’s nothing really new here ... [but] some of these remarks confirm things that have been noted in the past" and that he was "not surprised" by the students’ comments.

Five of the participants noted that the students’ comments were more positive and helpful than they had anticipated. One of these faculty members noted, "There were some good responses in terms of how I can make the course better next semester for the students. So I really appreciated the feedback that I received from the study and from the students." Another explained that student feedback "surprised her on the pro-side" and helped her to understand "the things that the students like about the course, [such as] the consistency of the course, the organization of the course, [and] the feedback that I give to them. Interestingly enough, they don’t feel like they’re treated like students, but that they’re treated more like colleagues. I don’t know what I’m doing that they feel that way, but it’d be interesting to find out." In some cases, the data confirmed previous course feedback; in others, they helped the instructors identify successful elements in their approach to online teaching.

  3. The faculty member was able to plan course improvements based on the data gathered during the SGID. Each of the faculty participants planned to make improvements to their online courses based on the data gathered during the SGID, and in each case, the planned changes were the result of very specific student feedback. One faculty member planned to incorporate a live component into the course, either on-campus office hours or availability by Skype, after learning that her graduate students wanted to talk to her face-to-face. Another faculty member, who teaches a course that requires a twenty-hour service component, planned to reduce the number of hours to ten after learning that most of her students worked full time and had families to care for. However, she increased the number of text-based quizzes after discovering that they were surprisingly popular among the students and also "forced" the students to complete the required reading. A third faculty member planned to provide more technical support to help her students learn how to do voice-overs in PowerPoint for a class assignment. She also planned to improve the organization of online files so that her students will be able to find what they are looking for more easily.

  4. The faculty member identified the need for or expressed interest in additional professional development as a result of participation in the SGID. Two faculty participants, both of whom had taught at least three courses online, felt that they did not need any further professional development. One of them explained that she was already proficient in the technology; the other said, "I don’t know that you really need training, other than knowing that the capability is there."

However, the less experienced faculty participants expressed a strong desire for continued support. One explained, "I have not read much literature (I probably haven’t read any of the literature) on how to conduct an online class. I’ve been feeling my way, and I probably would benefit from the experience of other instructors." Another participant advocated for additional training:

We need many more opportunities to continue learning about online instruction. There is so much technology out there that’s upgraded daily. I don’t want to become passé. I don’t want to be working in a medieval century when the students know much more about it than I do and can do many more things. I think we need to know what’s out there and how to incorporate that into online instruction. I also believe that we need to help new online instructors who are looking for support from experienced people.

These results corroborate responses from the pre-SGID faculty survey. In general, participants who asked for more professional development were those who had less previous experience with online instruction.

  5. The faculty member felt that taking the time to participate in an SGID was beneficial and would participate in the process again in a future class. All participants interviewed were interested in participating in an SGID again. Two faculty members mentioned a willingness to participate because of the small time commitment required of them. One said that he would participate again because he does not "see a reason not to." Another said that she would participate again because the SGID was "nothing that I had to worry about."

Five participants explained that they wanted "to hear more" from their students because they "got a lot of valuable feedback" and that if "there are ideas out there [from students] that could help, I should be open to them." One said that she would "absolutely" participate in order to find additional ways to be innovative and that she "love[s] the fact that I’m part of the study, I think it’s a healthy experience for us all. We need to know how well we’re doing, and I think that this is one excellent way that can give us that measure, that opportunity." Finally, one instructor explained, "I’m really grateful for the SGID. Otherwise, I just had nothing to go on."

Discussion

Faculty responses demonstrated a positive view of the online SGID process. Analyzed through Chism’s conceptual framework, online SGID has demonstrated potential effectiveness as a faculty development approach for helping faculty learn how to teach online. Online SGID provides faculty with information about problems in a course: Chism argues that faculty recognition of problems in their teaching is often the impetus needed to spur faculty to change within the ongoing four-step learning cycle. As the results explained, the online SGID process resulted in faculty identifying areas for change within their courses, planning change, and expressing the desire for further participation in professional development. Feedback from students allowed faculty to identify and evaluate possible weaknesses in their course, as well as validate strong areas.

Chism also recommended specific developmental approaches for faculty in each of the four phases of the faculty learning cycle. She argues that faculty development programs should address the specific needs of the faculty in each of these phases to increase their chances for success. Chism explains that faculty in the reflecting phase benefit most from programs that help them reflect on the effectiveness of their current teaching practices, such as programs that provide them with information on their teaching or identify existing instructional needs. Online SGID provides this information, as well as the peer support that Chism recommends during this phase. Faculty in the planning phase benefit from additional ideas for teaching or information that helps evaluate the usefulness of ideas. Online SGID offers a space for this information sharing to take place during the teaching consultations that follow the online meetings with students. Faculty in the acting phase need support in implementing a teaching innovation, such as encouragement from a peer or instructional developer experienced in the approach. Relationships formed with the facilitator in the online SGID process can be extended to support faculty involved in this phase of the faculty learning cycle. Finally, faculty in the observing phase are searching for information about the effectiveness of a new approach; Chism specifically recommends "informal oral or written student reactions ... or using a mid-semester course evaluation process" (p. 43) to provide faculty with this information. Online SGID as a faculty development tool clearly fits this need well and is potentially effective for faculty at any stage of Chism’s faculty learning cycle.

Challenges and Opportunities

The faculty who participated in the online SGID process gained valuable insights into their own teaching, had a positive perception of the process, and expressed the desire for future participation in the online SGID program. Analysis of the adaptation of this faculty development technique to the online environment indicates that this strategy has the potential to be effective in helping faculty in any stage of Chism’s faculty learning cycle. However, although the overall approach appeared to be successful, some logistical challenges arose that made implementation of the program more difficult. These challenges led to several insights and recommendations that other faculty developers can apply to minimize potential problems in implementing a similar program at their own institutions.

Student participation in these online SGIDs was lower than anticipated: whereas students in a traditional, classroom-based SGID are a captive audience, an online, synchronous SGID can make student participation more difficult. The lower participation could be attributed primarily to the terms of the institutional review board approval for this study, which required that faculty not offer rewards or make student participation part of the course grade. Without mandating participation through extra credit, a participation grade, or another incentive, faculty developers conducting an online SGID can anticipate similarly low student participation. For asynchronous courses, it may also be helpful to offer more than one time for the synchronous online discussion in order to accommodate conflicting student schedules.

Faculty resistance to participation in an online SGID was an unexpected challenge. Of the sixteen faculty invited to participate in the study, only seven accepted the offer; three of the faculty who opted not to participate noted specific reasons for their refusal. One was retiring at the end of the semester, so improving teaching for future courses was no longer a concern. However, the other two faculty members expressed anxiety about confidentiality. Although the facilitators provided faculty with assurances of complete confidentiality, including prepared confidentiality forms approved by the university’s institutional review board, these faculty members remained anxious that negative student feedback would somehow be shared with their colleagues. Because both faculty were about to be reviewed for promotion, they declined participation. Their reaction demonstrates the need for professional development in online instruction: these faculty members, who consistently receive stellar reviews for their classroom teaching, were so unsure about the quality of their online instruction that they felt student feedback had the potential to damage their professional reputations. Faculty developers implementing an online SGID program should be prepared with alternative ways to support faculty who are anxious about receiving feedback about their online teaching, such as private, confidential individual teaching consultations.

References

  • Allen, I. E., & Seaman, J. (2010). Learning on demand: Online education in the United States, 2009. Retrieved from http://www.sloan-c.org/publications/survey/pdf/learningondemand.pdf
  • Bergquist, W. H., & Phillips, S. R. (1977). A handbook for faculty development. Washington, DC: Council for the Advancement of Small Colleges.
  • Braskamp, L. A., Ory, J. C., & Pieper, D. M. (1981). Student written comments: Dimensions of instructional quality. Journal of Educational Psychology, 73, 65-70.
  • Carroll, R. G. (1993). Implications of adult education theories for medical school faculty development programmes. Medical Teacher, 15(2/3), 163-170.
  • Chen, Y., & Hoshower, L. B. (2003). Student evaluation of teaching effectiveness: An assessment of student perception and motivation. Assessment and Evaluation in Higher Education, 28(1), 71-88.
  • Chism, N. (2004). Using a framework to engage faculty in instructional technologies. EDUCAUSE Quarterly, 2, 39-45.
  • Clark, D. J., & Bekey, J. (1979). Use of small groups in instructional evaluation. Journal of the Professional and Organizational Development Network in Higher Education, 1, 87-95.
  • Coffman, S. J. (1998). Small group instructional evaluation across disciplines. College Teaching, 46(3), 106.
  • Creed, T. (1997). Small group instructional diagnosis. National Teaching and Learning Forum, 6(4). Retrieved from http://www.ntlf.com/html/pi/9705/sgid.htm
  • Drennan, J., Kennedy, J., & Pisarski, A. (2005). Factors affecting student attitudes toward flexible online learning in management education. Journal of Educational Research, 98(6), 331-338.
  • Gilbert, S. (1995). Teaching, learning and technology: The need for campuswide planning and faculty support services. Change, 27(2), 46-52.
  • Kinuthia, W. (2005). Planning faculty development for successful implementation of Web-based instruction. Campus-Wide Information Systems, 22(4), 189-200.
  • McKeachie, W. J., & Kaplan, M. (1996, February). Persistent problems in evaluating college teaching. AAHE Bulletin, 48(6), 5-8.
  • Millis, B. J., & Vasquez, J. (2010-2011). Down with the SGID! Long live the QCD! Essays on Teaching Excellence: Toward the Best in the Academy, 22(4), 1-5.
  • Palloff, R. M., & Pratt, K. (2003). The virtual student: A profile and guide to working with online learners. San Francisco, CA: Jossey-Bass.
  • Redmond, M. V. (1982). A process of midterm evaluation incorporating small group discussion of a course and its effect on student motivation. Washington, DC: U.S. Department of Education Educational Resources Information Center.
  • Richardson, J. C., & Swan, K. (2003). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1), 68-88.
  • Schein, E. H., & Bennis, W. G. (1965). Personal and organizational change through group methods. Hoboken, NJ: Wiley.
  • Seaman, J. (2009). Online learning as a strategic asset. Volume II: The paradox of faculty voices: Views and experiences with online learning. Washington, DC: Association of Public and Land-Grant Universities.
  • Seldin, P. (1988). Evaluating college teaching. In R. E. Young & K. E. Eble (Eds.), New directions for teaching and learning: No. 33. San Francisco, CA: Jossey-Bass.
  • Seldin, P. (1993, July 10). The use and abuse of student ratings of professors. Chronicle of Higher Education.
  • Sherry, A. C., & Burke, W. F. (1995). Applying an interactive evaluation model to interactive television. In Proceedings of the 1995 Annual National Convention of the Association for Educational Communications and Technology (AECT). Anaheim, CA.
  • Tubbs, S. L. (1997). A systems approach to small group interaction (6th ed.). New York, NY: McGraw-Hill.
  • Watson, S. W., & Rutledge, V. C. (2005). Online course delivery and student satisfaction. (ED490363)
  • Wulff, D. H., Staton-Spicer, A. Q., Hess, C. W., & Nyquist, J. D. (1985). The student perspective on evaluating teaching effectiveness. ACA Bulletin, 39-47.