The faculty peer assistants (FPAs) program combines a mentoring and peer review process for initial online faculty course development and subsequent course revision. An FPA mentors colleagues during course design and conducts peer reviews when the courses are complete. The program incorporates a peer review and evaluation form that outlines course standards and guides the faculty course developer, the peer reviewer, and the department chair. Feedback about the program from department chairs, faculty course developers, and FPAs was uniformly positive.

With the rapid growth of online courses, higher education institutions are seeking methods to ensure quality and meet accreditation standards. Standards for online courses and peer review are two common tools for quality assurance. In fall 2005, Middle Tennessee State University (MTSU), a large public institution with over twenty-five thousand students, implemented a continuous course development and quality improvement process for online courses that combines mentoring and peer review.

Mentoring and Peer Review Programs

Marek’s (2009) survey of library and information science faculty in accredited master’s programs showed that faculty who develop online courses most frequently rely on their peers for support, followed by the institution’s information technology workshops. Thus, Marek suggests that an effective model for online course development should include structured peer mentoring.

Several programs exist for mentoring distance learning faculty. One example is Park University’s online instructor evaluation system (Mandernach, Donnelli, Dailey, & Schultz, 2005). Online instructors are paired with a faculty evaluator who, during an eight-week term, reviews the instructor’s facilitation and delivery of the online course five times. In addition to these formative reviews, which are accompanied by peer mentoring, the academic department completes a summative review at the end of the course.

Another example of mentoring for online course development paired technically savvy graduate students with English faculty in developing online courses for the first time (Alvarez, Blair, Monske, & Wolf, 2005). The authors mention issues with the power relationships between students and faculty as a barrier, but found the program to be valuable for graduate student professional development as well as increasing the number of online course offerings.

In an in-house faculty peer review program for online courses at the School of Nursing at Indiana University (Cobb, Billings, Mays, & Canty-Mitchell, 2001), reviewers include a nursing faculty member and an instructional technology expert. Using a course checklist, the reviewers provide the instructor with a written report that outlines recommendations for improvement and hold a follow-up meeting to answer questions. The primary use of the peer review is for course improvement, since the course is evaluated after it has been taught for one year.

One of the most widely known peer review programs for online courses, Quality Matters (2006), uses a Quality Matters (QM) peer course review rubric with forty standards for online course design. The program uses an interinstitutional and disciplinary peer review team process to evaluate course design, though not delivery. A course that scores high enough on the rubric receives a QM designation.

The MTSU faculty peer assistants (FPA) program draws on other mentoring and review efforts but provides a different approach that addresses what we see as some of the weaknesses and limitations of other efforts. The program also has unique strengths that make it a potential model for other colleges and universities:

  • It is conducted in-house and across disciplines.

  • The mentoring and review are initiated during course development rather than during or after teaching.

  • The program involves the department chairs in the development and approval process.

  • The program uses a continuous improvement process that takes place at initial course development and occurs again at the revision of the course every three years.

History of the FPA Program

In 2001, MTSU established a committee and charged it with reviewing new online courses. The committee consisted of several experienced faculty course designers (FCDs) who used an abbreviated rubric and met regularly to review courses. Because of faculty schedules, the committee could not meet as often as necessary, and courses were assigned to individual committee members.

In 2003, the Distance Learning Faculty Services Office sponsored a group of twelve faculty to earn the certified online instructor (COI) designation through the Learning Resources Network (LERN). At a 2004 reorganization meeting, the review committee suggested recruiting COIs to serve as peer reviewers. Most of the initial twelve COIs agreed to serve as peer mentors and selected the name Faculty Peer Assistants Program. One of the original committee members offered a review form based on a rubric developed at the California State University at Chico (2009). The rubric was adopted by the new FPA program members and continues to be used and updated. In summer 2005, the program was piloted, and several courses were successfully reviewed.

Since the program began, forty-four MTSU faculty members have earned the COI, and approximately twenty-five serve as FPAs at any given time. Completion of COI training is a requirement for serving as an FPA, and faculty development opportunities in the areas of peer mentoring, online pedagogy, and course development are continually provided by the College of Education and Distance Learning (CEDL). MTSU faculty peer assistants receive a stipend of $125 per course assignment, which may include several reviews of the same course. At this writing, FPAs have conducted over 150 reviews.

Course Development, Review, and Approval Process

When a potential FCD expresses interest in developing a new online course, the distance learning faculty services office e-mails a course proposal, a standard syllabus template, the peer review and evaluation form (PR form), and abbreviated instructions. Links to more detailed information and training, contained in the websites of distance learning faculty services and the faculty instructional technology center, are also included.

When faculty services receives the course proposal (approved by the department chair) and syllabus, the FCD receives another e-mail that contains the name of the assigned FPA and an online course development agreement. Unlike other mentoring programs (for example, Cobb et al., 2001), assignment of the FPA is made at the beginning of course development. This is especially important for inexperienced FCDs, and it allows mentors to focus on pedagogy at the earliest stages of course development.

Consistent with the recommendation of Lottero-Perdue and Fifield (2010), the mentoring relationship between the FPA and the online faculty course developer is one-on-one. In our program, FPAs are assigned not by rank but by pedagogical experience. This results in some junior faculty mentoring senior faculty, which creates an interesting dynamic. As in the California State University teacher observation/peer support program (Webb & McEnerney, 1995), FPAs are paired with FCDs outside their department for course review. In a cross-department model, faculty can focus on pedagogy instead of being distracted by content, and the reduced likelihood of close colleague mentoring fosters greater objectivity.

When course development is complete, the FCD conducts a self-evaluation using the PR form. This self-evaluation is sent to the FPA, who reviews it and conducts his or her peer review using the same form, and then sends it back to the FCD to address recommended revisions. Depending on the nature and extent of revisions, the FPA may review those changes. When the FCD and FPA are satisfied, the PR form is forwarded to distance learning faculty services. The PR form and course approval form are then e-mailed to the department chair, who is responsible for reviewing the course and approving it for delivery. When the course approval form is signed and returned to faculty services, the scheduling center is notified that the course may be added to the semester schedule.

Although control of the FPA program rests with the faculty services office, FPAs played a major role in developing the program and its guidelines. The FPAs meet annually to review the program and recommend changes, giving them a voice in and control over the program’s continuing evolution.

Peer Review and Evaluation Form

The PR form is a rubric that guides the development and review process for new online courses, as well as those being revised on a three-year cycle. It addresses six general review areas and has been revised on an ongoing basis by FPAs. A modified version of a tool developed at California State University at Chico (2009), the rubric is based on national research regarding online learning. Reviewers rate each area using a three-point scale (LTS = less than satisfactory, S = satisfactory, and X = exemplary), can write in specific comments, and must provide an explanation for any section receiving the LTS rating. (Readers can see the rubric at www.mtsu.edu/learn/faculty/pdf/online_peer_evaluation.pdf.)

The first review area is learner support and resources. This includes such items as instructor contact information, virtual office hours, instructor response time, and emergency contact information. Resource information includes links to a variety of course-specific materials, including media resources such as tutorials and podcasts.

The second review category is course design and organization. In this section, FPAs assess whether the course is logically constructed, the syllabus is complete, requirements are clearly defined, and information is presented clearly. In addition, reviewers rate whether pages are consistently presented to assist students with navigation, whether there is a statement about accommodations available under the Americans with Disabilities Act (ADA), and whether the course is ADA compliant (through such means as alt tags to describe in text the content of images and transcribed text for audio).

The third review topic is instructional design and delivery. In this section, FPAs rate the extent to which interaction, communication, and collaboration are incorporated into the course through opportunities for interactions among students, between students and teacher, and between students and content. They also assess whether learning objectives are identified, whether and how class activities are used, and if multiple learning styles (such as visual and tactile) are considered in the design.

Assessment and evaluation of student learning is the fourth review area. In this section, reviewers examine whether student readiness strategies for online learning are incorporated into the course (for example, a registration permit or assessment tool used at the beginning of the class). In addition, assessments are made of the extent to which course objectives, instructional strategies, and assessment techniques are aligned, as well as whether multiple and ongoing assessment strategies are used.

The next rubric topic is appropriate and effective use of technology. In this area, reviewers determine whether students are given adequate information to access their course materials, if a variety of tools are used (for example, discussion, Web pages, chat, quizzes, and blogs), and if a variety of multimedia and learning objects are used to enhance the course.

The final review topic is opportunities for and use of student feedback. Here, reviewers assess whether students are given opportunities to provide feedback about course design and navigability, feedback is integrated into course design and instruction, and opportunities are available for student self-assessment as well as peer feedback.

Once the reviewers have assessed the six major sections, they provide a summary evaluation of whether the course is ready for delivery. There are three options: (1) yes; (2) yes, with minor modification (additional review not required); and (3) no, with major modification recommended (requires additional review prior to delivery). Reviewers also confirm whether all course materials are located within the course management system and, if not, how and why the course uses other materials.
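The form’s overall scheme lends itself to a simple data model. The sketch below is purely illustrative (the names `AreaReview` and `validate_review` and the layout are our own, not part of the actual PR form), but it captures the six review areas, the three-point rating scale, and the rule that any LTS rating must be accompanied by an explanation:

```python
from dataclasses import dataclass

# Illustrative sketch only; class and function names are our own,
# not part of the actual MTSU PR form.
RATINGS = {"LTS", "S", "X"}  # less than satisfactory, satisfactory, exemplary

AREAS = [
    "Learner support and resources",
    "Course design and organization",
    "Instructional design and delivery",
    "Assessment and evaluation of student learning",
    "Appropriate and effective use of technology",
    "Opportunities for and use of student feedback",
]

@dataclass
class AreaReview:
    area: str
    rating: str         # one of RATINGS
    comments: str = ""  # free-text comments; required when rating is "LTS"

def validate_review(reviews):
    """Flag unknown ratings and LTS ratings that lack an explanation."""
    problems = []
    for r in reviews:
        if r.rating not in RATINGS:
            problems.append(f"{r.area}: unknown rating {r.rating!r}")
        elif r.rating == "LTS" and not r.comments.strip():
            problems.append(f"{r.area}: LTS rating requires an explanation")
    return problems
```

A review in which every area is rated S or X passes with no problems; an unexplained LTS rating is flagged for the reviewer to address before the form moves forward.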

Challenges and Solutions

Several challenges have been encountered with implementing the FPA program. One area that required early improvement was the responsibility for course final approval. Since department chairs are ultimately responsible for their faculty and department, the CEDL determined that chairs should approve delivery of new online courses developed by their FCDs. Requiring chairs to review the new courses permitted them to review course content and helped build trust between them and the program.

Day-to-day challenges with the FPA program include how to make peer reviewer assignments in an equitable manner. When the program began, e-mails were sent requesting volunteers from the FPA pool. However, some people were assigned multiple courses because they responded first to requests. In addition, colleagues were offering to review their friends’ courses. As a result, some FPAs were becoming overloaded and were not providing good mentoring service. To address these issues, a course review tracking system was developed that now allows us to make assignments based on FPA availability and the number of course reviews already assigned to them.
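The tracking system’s assignment logic can be sketched as a simple load-balancing rule. This is an assumption-laden illustration (the function `assign_fpa` and the record fields are ours, not the actual MTSU system): choose an available FPA outside the developer’s department who currently has the fewest assigned reviews.

```python
# Hypothetical sketch of load-balanced reviewer assignment; the function
# and field names are illustrative, not the actual MTSU tracking system.
def assign_fpa(fpas, fcd_department):
    """Return the available FPA outside the FCD's department with the
    fewest reviews already assigned, or None if no one qualifies."""
    candidates = [
        f for f in fpas
        if f["available"] and f["department"] != fcd_department
    ]
    if not candidates:
        return None  # escalate to the faculty services office
    return min(candidates, key=lambda f: f["assigned_reviews"])
```

For example, given a pool in which one available cross-department FPA has one review and another has three, the rule assigns the less-loaded reviewer, which is exactly the overload problem the tracking system was built to prevent.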

When a course is not recommended for delivery by the FPA or the chair, an opportunity is created for faculty development and course improvement. When the peer review results are less than satisfactory, the FPA sends an informal e-mail to the FCD addressing the concerns and providing an opportunity to make changes before the formal peer review is conducted. On the rare occasion that a course is still not acceptable, the distance learning faculty service office serves as the liaison between the FCD and the FPA to resolve the concerns. Some courses have been reviewed several times (and revisions made) before receiving FPA approval. It is important to note that although the FPAs are charged with providing honest, constructive feedback, FCDs are not required to make the recommended revisions, although most do and are grateful for the guidance. Ultimately the department chair is responsible for reviewing and approving the course for delivery. If a department chair does not approve a course, the distance learning faculty services office serves as liaison and contacts the FCD to request recommended changes. When the changes are made, the chair again reviews the course and approves addition to the semester schedule.

Another challenge occurs when the course review process is not completed in a timely manner. If an FCD has a course that is ready for review but receives no response from the assigned FPA, the faculty services office contacts the FPA. If there are problems or conflicts with conducting the review, as in one instance when an FPA had a family emergency and was unexpectedly out of state for an extended time, reassignment of the course to a different FPA is offered.

The variation in the extent to which FPAs engage in mentoring the FCD is another program challenge. When the FPA program started, it offered just peer review of the quality of online courses. As it evolved, duties extended to mentoring new FCDs. Some FPAs still perceive their duties as simply conducting peer review at the conclusion of course development. Since so many faculty members are new online course designers, FPAs are reminded, through information and training, that mentoring is an important part of the program. The mentoring component includes providing assistance during course design, answering questions and facilitating revisions during the approval process, and providing continued assistance and consultation during the first semester of course delivery.

A broader challenge was gaining buy-in from constituents. There were issues with convincing potential FCDs of the need for a formal review process. Department chairs and potential FCDs also needed to be convinced of the need for the additional levels of bureaucracy that were created. The most important program improvements (chair approval and scheduling controls) were implemented in summer 2007 and were the result of a year of meetings with administrative and faculty groups to obtain that buy-in and support.

Because the FPAs are faculty members, they are more likely to be constrained by competing job demands than if information technology staff were serving in this capacity. As Brinthaupt, Clayton, and Draude (2009) noted, faculty interested in integrating instructional technologies experience a number of technology-related barriers (the wide range of options, the pace of changes and innovations) and academic-related barriers (time and effort, tenure and promotion concerns). Many of the challenges we experienced with implementing the FPA program are directly or indirectly related to these kinds of barriers. For example, because the FCDs may want to incorporate technologies that the FPAs are not familiar with, peer mentoring can sometimes be limited in effectiveness. Or FCDs may try to incorporate too many tools into their courses, leading to neglect of pedagogical issues or a slowdown in the development process. FPAs have to find the time to conduct their reviews and must frequently work around the schedules and deadlines of the FCDs while receiving limited compensation, recognition, or credit from their departments or university. Similarly, some FCDs may not see the value of putting in large amounts of time and effort in developing their courses, resulting in less attention to best-practice principles. Thus, there has been an evolving balance among the criteria that the FPA program sets for new online courses, the amount of work the FPAs can devote to peer mentoring and review, and the expectations and preferences of the FCDs.

Our campus has developed mentor guidelines for the FPAs and offers continuous opportunities for professional development in peer mentoring. A customized workshop on peer mentoring was developed for our FPAs by LERN. In addition, new FCDs can learn about the program and its mentors through a Meet Your Mentor site, located within the university’s course management system, which details the FPAs’ experiences with distance learning and includes personal and professional interests. We also include best-practice resources for online course development on the site.

An important attribute of the FPA program that sets it apart from other programs is that it supports the Best Practices for Electronically Offered Degree and Certificate Programs advocated by the Commission on Colleges (2000). Some of these best practices include providing training and support to participants, ensuring that electronically offered programs and courses meet institution-wide standards (to provide consistency for students who may enroll in both electronically offered and traditional on-campus courses), maintaining appropriate academic oversight, and having academically qualified persons participate fully in the decisions concerning program curriculum and program operation.

Formal and Informal Evaluation of the FPA Program

To assess the effectiveness of the FPA program, brief surveys were sent to chairs of departments with online offerings, FCDs who had gone through the program, and the FPAs who had worked with the program. The surveys were completed online using a commercial survey program. Respondents rated a variety of statements pertaining to the operation of the program, using five-point Likert scales (1 = strongly disagree to 5 = strongly agree).
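The descriptive statistics reported in Tables 13.1 through 13.3 are ordinary means and sample standard deviations of the Likert ratings. For readers replicating this kind of analysis, a minimal Python sketch (with made-up ratings, not our actual survey data) is:

```python
import statistics

def describe(responses):
    """Mean and sample standard deviation of five-point Likert ratings,
    rounded the way the tables report them."""
    mean = round(statistics.mean(responses), 2)
    sd = round(statistics.stdev(responses), 3)  # sample (n - 1) SD
    return mean, sd

# Made-up example ratings (1 = strongly disagree ... 5 = strongly agree)
example = [5, 4, 4, 4, 5]
```

For the example ratings above, `describe(example)` returns (4.4, 0.548).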

Table 13.1 presents the descriptive statistics for the department chairs’ evaluation of the FPA program. Chairs reported improved quality of online courses and increased comfort with online offerings. In addition, they reported being satisfied with their degree of involvement in the program, indicated a preference for their faculty to develop more online courses, and felt that the course development process did not need improving. These results suggest that, from the perspective of department chairs, the FPA program has been a success.

The FCDs reported having taught an average of 2.52 (SD = 3.12) online courses and having taught online for an average of 4.90 years (SD = 3.73). Descriptive statistics from the FCDs’ ratings of the FPA program are presented in Table 13.2. FCDs were uniformly favorable regarding the FPA program. In particular, they reported that the program improved the quality of their online courses and how they were taught. They were satisfied with the levels of assistance and interactivity provided by their FPA as well as with the PR form. They were more likely to develop additional online courses as a result of the program, and they disagreed that the peer review process needs to be improved.

Table 13.1 Department Chairs’ Evaluation of the FPA Program
Variable | Mean | SD
The CEDL online course development process has improved the quality of the courses my faculty have created. | 4.23 | .599
The CEDL online course development process has made me more comfortable about my department offering additional online courses. | 4.08 | .900
As a chair, I am satisfied with my degree of involvement in the CEDL online course development process. | 4.15 | .555
In my department, the oversight for online course development is more extensive than the oversight for face-to-face course development. | 2.85 | 1.463
I would like more of my faculty to develop and offer online courses. | 3.77 | 1.166
The quality of the course development process needs to be improved. | 2.31 | .751
Note: N = 13; response rate = 50 percent.
Table 13.2 FCDs’ Evaluation of the FPA Program
Variable | Mean | SD
The CEDL faculty peer review process has improved the quality of the online course(s) I developed. | 4.08 | 1.06
The CEDL faculty peer review process has improved how I teach the online course(s) I developed. | 3.68 | 1.10
I am satisfied with the level of assistance provided by my FPA. | 4.06 | 1.07
The level of interactivity with my FPA was adequate. | 3.90 | 1.05
My experience with the peer review process makes it more likely I will develop additional online courses. | 3.75 | 1.20
The CEDL peer review evaluation form was helpful in the development of my online course. | 3.86 | 1.21
The quality of the faculty peer review process needs to be improved. | 2.43 | 1.14
Note: N = 52; response rate = 37 percent.

The FPAs reported having conducted an average of 8.67 (SD = 6.69) course reviews to date and having taught online for an average of 8.50 years (SD = 3.03). Table 13.3 presents the evaluative data from the FPAs. Similar to Gibson’s (2004) finding of a reciprocal relationship in mentoring, FPAs reported several benefits from participating in the program. Informal comments from the FPAs and the FPA survey results suggest that both the mentor and the mentee receive benefits. In particular, mentors noted that mentoring provided them with ideas to improve their own online teaching, helped them think more critically about their own online courses, and encouraged them to focus on course development best practices. They also reported that working with faculty from other disciplines was beneficial and that the PR form was helpful.

In summary, department chairs, FCDs, and FPAs rated the program as effective and successful. There were no strong feelings from any of the target groups that the program needed to be improved. Both the FCDs and FPAs reported that participating in the program has improved their own online teaching and course development.

Table 13.3 FPAs’ Evaluation of the FPA Program
Variable | Mean | SD
Mentoring has provided me with ideas that I have used to improve my online teaching. | 4.00 | .94
Mentoring has helped me deal more productively with the challenges of teaching online. | 3.90 | .99
Mentoring has helped me think more critically about my own online courses. | 4.10 | .99
Mentoring has helped me focus on best practices in course development. | 4.10 | .99
My own teaching has benefited from working with faculty from other disciplines. | 4.20 | .63
The stipend I am paid for conducting a course review is sufficient, given the time and effort I typically expend. | 3.10 | .74
The CEDL peer review evaluation form is helpful in the mentoring process. | 4.67 | .50
The quality of the faculty peer review process needs to be improved. | 2.80 | 1.14
Note: N = 10; response rate = 44 percent.

In addition to the survey data, we have information about the program’s benefits through informal feedback from department chairs, FCDs, and FPAs. Comments from department chairs indicated that they value the fact that they are involved from the beginning of the course development process to the end (by approving the course development agreement and by approving final development and delivery). Comments from FCDs indicate that they appreciate learning more about the pedagogy behind online instruction, viewing other perspectives on the use of technology, seeing how others have solved issues related to the challenges of teaching online content, and having the opportunity to work with seasoned online instructors throughout the development process. Comments from FPAs indicate that they appreciate the emphasis on best practices and being able to see things from the student perspective.

The systematic review of online courses has larger benefits that could be considered for expansion to the design of traditional courses. Using a standard syllabus template provides students with consistent information about library services for distance learners, services for disabled students, technical support information, online tutoring services, and the distance learning testing center. Improved course quality is an additional benefit to the students. FPAs have made suggestions for improved course navigation, organization, and delivery. Department chair reviews have also improved course quality. For example, one course did not include the approved departmental learning outcomes. The chair recommended this change, and the course was revised accordingly. Although our campus has yet to consider this possibility, the development and implementation of peer review and evaluation programs for traditional courses, modeled after the one discussed in this chapter, might be an innovative and effective next step.

Although the institution has implemented many strategies to improve student outcomes in online courses, we believe that the FPA program has been one of the most significant. In a recent survey of 328 students conducted by the distance learning office, the majority indicated satisfaction with online courses (84 percent were satisfied or very satisfied with their course), and 91 percent said they would take another online course. Compared to traditional (face-to-face) MTSU courses, 60 percent said they learned the same amount, 25 percent said they learned more, and 15 percent said they learned less.

The strength of the FPA program is that it benefits the institution, mentee, and mentor. The institution benefits by socializing faculty into the distance learning community, meeting accreditation standards, and providing students with improved courses. Mentees and mentors benefit by learning effective online teaching strategies from each other. Most departments at MTSU have only a few faculty members who develop and teach online courses. Similar to the goal of the TOPS program at California State University (Webb & McEnerney, 1995), the FPA program strives to reduce the isolation of teaching. Teaching and learning centers can also reduce the isolation of teaching; however, these centers typically focus on broad teaching issues and the use of technology in teaching in the classroom. MTSU’s learning, teaching, and innovative technologies center (LT&ITC) offers workshops that bring together faculty for one-time discussions of using technology in teaching; however, the participants may or may not be teaching online courses. Bringing together isolated faculty who share a common interest in online teaching is a benefit of the FPA program that is not addressed consistently by our LT&ITC. However, both programs are important in recruiting and developing online faculty course developers on campus.

Institutions that would like to implement a similar program should secure support from academic affairs, both in resources and in enforcing guidelines for online course development. Faculty buy-in is essential. Although we feel that the FPA program works well, other institutions might consider some alternative approaches. The survey results suggested that the FPAs were somewhat ambivalent about the compensation for their work, so reducing that amount would probably not be wise. Instead of providing FPAs an external certification, an internal certification program could be developed by the campus teaching and learning center or information technology department. In lieu of providing FPAs with a stipend, the institution could allow their service to count as committee work. Alternatives to paying faculty a development fee for online course development include assigning graduate students to assist faculty or providing faculty release time. At some institutions, a fee assessed on online courses could offset the cost of online course development.

Our experience with the development and evolution of the FPA program suggests that it has provided many benefits. Students receive a more consistent course experience, which we believe has improved satisfaction and retention. Faculty connect to others in the distance learning field, develop and offer improved online courses, and provide better service to students and the university. Department chairs participate in the program from beginning to end and feel they have some control over what their faculty and department are offering. Giving department chairs a rubric based on best practices with which to evaluate the course has served to educate administrators about effective pedagogy, online and otherwise. Having FPAs residing in the departments provides their colleagues with a readily available source of information about online learning and the processes of online course development.

In summary, the FPA program innovatively and creatively uses limited institutional resources to improve the development and quality of online course offerings. By tapping into the knowledge and experiences of faculty mentors, the program guides course designers, reviewers, and administrators toward the incorporation of best practices.

References

  • Alvarez, D. M., Blair, K., Monske, E., & Wolf, A. (2005). Team models in online course development: A unit-specific approach. Educational Technology and Society, 8(3), 176–186.
  • Brinthaupt, T. M., Clayton, M. A., & Draude, B. J. (2009). Barriers to and strategies for faculty integration of IT. In P. Rogers, G. Berg, J. Boettcher, C. Howard, L. Justice, & K. Schenk (Eds.), Encyclopedia of distance learning (2nd ed., Vol. 1, pp. 138–145). Hershey, PA: IGI Global.
  • California State University at Chico. (2009). Rubric for online instruction. Retrieved from www.csuchico.edu/tlp/resources/rubric/rubric.pdf
  • Cobb, K. L., Billings, D. M., Mays, R. M., & Canty-Mitchell, J. (2001). Peer review of teaching in web-based courses in nursing. Nurse Educator, 26(6), 274–279.
  • Commission on Colleges, Southern Association of Colleges and Schools. (2000). Best practices for electronically offered degree and certificate programs. Retrieved from www.sacscoc.org/pdf/081705/commadap.pdf
  • Gibson, S. K. (2004). Being mentored: The experience of women faculty. Journal of Career Development, 30(2), 173–188.
  • Lottero-Perdue, P. S., & Fifield, S. (2010). A conceptual framework for higher education faculty mentoring. In L. B. Nilson & J. E. Miller (Eds.), To improve the academy: Vol. 28. Resources for faculty, instructional, and organizational development (pp. 37–62). San Francisco: Jossey-Bass.
  • Mandernach, B. J., Donnelli, E., Dailey, A., & Schultz, M. (2005). A faculty evaluation model for online instructors: Mentoring and evaluation in the online classroom. Online Journal of Distance Learning Administration, 8(3), 1–30.
  • Marek, K. (2009). Learning to teach online: Creating a culture of support for faculty. Journal of Education for Library and Information Science, 50(4), 275–292.
  • Quality Matters. (2006). Quality matters: Inter-institutional quality assurance in online learning: A grant project of MarylandOnline: Summary. Retrieved from www.qualitymatters.org/documents/final%20FIPSE%20Report.pdf
  • Webb, J., & McEnerney, K. (1995). The view from the back of the classroom: A faculty-based peer observation program. Journal on Excellence in Teaching, 6(3), 145–160.