Abstract

For faculty development events to have the greatest impact on campus practice, faculty developers need to attract and include as many faculty members as possible at their events. This article describes the testing of a checklist regarding faculty attendance at professional development events through a survey of 238 faculty members at small colleges in the United States. The results demonstrate the influence of social relationships upon faculty attendance at teaching and learning events, the difficulties of scheduling such events, and motivational differences between full-time and adjunct faculty. The use of food as a motivator for attendance is also appraised. The relative fit of the data to the five attributes of Rogers (2003) and the four motives of Wergin (2001) is discussed.

Keywords: faculty development, event planning, faculty governance, part-time faculty, adjunct faculty, sessional faculty

In the classic movie Casablanca (1942), Captain Renault tells his officers to “Round up the usual suspects.” Likewise, faculty developers often refer to the core group of faculty who regularly attend teaching and learning events as “the usual suspects.” To have the greatest impact on campus, however, faculty developers must attract as many faculty members as possible, beyond the usual suspects, in order to broadcast effective teaching practices widely and to facilitate campus-wide participation in the Scholarship of Teaching and Learning (SoTL).

The impact of faculty development initiatives is often framed in terms of participation (Medina, Garrison, & Brazeau, 2010). How many faculty members get involved? How frequently, how intensely, and over what duration? The vexing question is this: How can Centers for Teaching and Learning (CTLs) increase faculty attendance, thereby building stronger social networks and increasing the chances that a valued practice will take root in the faculty community? In order to improve outreach efforts by faculty developers, this study aims to identify and rank the various reasons that faculty members might attend such events.

Wergin, Rogers, and Faculty Engagement

When choosing initiatives and offerings, faculty developers become community organizers. They act as cultural agents in their efforts to select and promote teaching practices and their associated ideas and values. For example, Lewis (1991) explores various theories of attitude change, arguing that faculty developers can leverage such theories in persuading faculty to attend events and participate in initiatives. Similarly, Zahorski (1993) describes the ways in which faculty developers assume leadership roles in effecting institutional change, in particular highlighting the way that developers can indirectly increase faculty engagement by advocating for organizational changes such as revised reward systems and recruitment processes. Most importantly, however, Zahorski argues for attitudinal change: “Attitudinal change is vital in that it serves as the foundation for all other significant organizational, curricular, and instructional changes on campus” (p. 237). Attitudes directly influence the degree to which any organizational, curricular, or instructional change will succeed.

A widely adopted framework for understanding how innovation can be spread in organizations is provided by Everett Rogers’s concept of the diffusion of innovation. Rogers (2003) states, “An innovation is an idea, practice, or project that is perceived as new by an individual or other unit of adoption” (p. 12). Faculty development initiatives, while not necessarily new or innovative in the broader sense, as they may consist of ideas that have been around for some time, can nevertheless represent innovation in the context of a campus community that is encountering that idea or practice for the first time.

Rogers (2003) breaks down the diffusion process into five familiar adopter categories (innovators, early adopters, early majority, late majority, and laggards), and offers five interdependent “perceived attributes” of an innovation: the degree to which it offers advantages (relative advantage), is compatible enough with familiar practices (compatibility), is relatively simple to adopt (complexity), is open to trial attempts (trialability), and is visible in its importance (observability) (p. 223).

While Rogers (2003) describes the spread of innovation based on the qualities of the innovation itself, Wergin (2001) describes campus participation in innovation from the perspective of the faculty’s motivations. Faculty motives for attending events on teaching and learning, argues Wergin (2001), are related to their motives for entering the professoriate in the first place. Citing “more than forty years of research on faculty motivation,” he offers four key motives: autonomy (the freedom to pursue personally meaningful problems and questions), community (the opportunity to be among kindred spirits), recognition (the desire to be valued), and efficacy (the sense of having made a difference). Just as faculty developers often advocate that faculty adopt a student-centered approach, Wergin advocates a faculty-centered approach to faculty development.

In addition to its overlap with Wergin, Rogers’ theory informs several studies of educational innovation, particularly those that explore factors affecting adoption of instructional technology. Dooley (1999) and Sahin (2006) offer reviews of studies focused on educational technology adoption. Based on her doctoral research, Moser (2007) developed a model that explains “a faculty educational technology adoption cycle,” highlighting barriers to adoption. Mullinix (2007) draws on Rogers’ research and theory to focus attention on faculty as agents of change. She focuses on faculty as potential “early adopters” who, when nurtured and recognized, can positively affect attitudes and practices of their peers. Rogers (2003) states, “The diffusion of innovations is essentially a social process in which subjectively perceived information about a new idea is communicated from person to person” (p. xx). Guided by Rogers, Mullinix suggests that faculty developers should nurture “home grown innovations and technology diffusion” by creating a highly visible “series of formal and informal networks, events and communication strategies [that] can create opportunities for dialogue … [thereby] positioning faculty as opinion leaders” who can help “move innovations to the point of critical mass” (p. 2). Like Mullinix, Fullan (2005) not only underscores the power of social networks but also echoes Wergin’s emphasis on the value of autonomy: “Assume that any significant innovation, if it is to result in change, requires individual implementers to work out their own meaning …. Assume that people need pressure to change (even in directions that they desire), but it will be effective only under conditions that allow them to react, to form their own position, to interact with other implementers, to obtain technical assistance” (p. 109).

Together, Wergin and Rogers offer us perspectives on the motives that drive faculty openness to “innovation”—openness to those practices and ideas that many faculty development centers want to bring to “critical mass.” Increasing faculty attendance at development events is a crucial step in building a dynamic social network supporting the adoption of valued practices. When planning a workshop or initiative, faculty developers can use perspectives offered by Wergin and Rogers to assess the relative promise of such efforts, in effect using a checklist aimed at maximizing faculty connection with others engaged in an “innovative” practice.

Creating a Checklist for Faculty Attendance

Drawing upon both Wergin and Rogers, we initially constructed a checklist that framed faculty development planning as an attempt to improve faculty engagement and attendance (see http://tinyurl.com/mr4ts75). In thinking about a planned faculty development activity as an “innovation,” we asked: What might be its relative advantages to faculty? Is it compatible enough with practices that faculty know? Does it present sufficiently interesting complexities and challenges? Is there a convenient means of experimentation, followed by a truly practical, timely path to expertise? And finally, will there be opportunities for visibility and recognition? Framed within the categories suggested by Rogers, the checklist included faculty-centered questions inspired by Wergin: What are the fascinating, disorienting, exciting challenges in teaching? With whom can one explore these challenges? Will one’s engagement benefit others and bring some meaningful attention to one’s effort? In practice, the checklist provided a means by which faculty developers could think about the sorts of questions that faculty members ask themselves when presented with an opportunity for faculty development, and thereby improve their ability to plan successful opportunities that meet faculty needs.

Method

After constructing this checklist and sharing it for comments and modification at the POD 2013 conference, we translated the list into a 5-point Likert survey to test whether the questions did indeed correspond to faculty motives for attending teaching and learning events and initiatives. Each checklist item was rewritten as a single survey item. The survey connects to the work of Rogers and Wergin by adopting the categories and terms they offer in order to pinpoint and rank the reasons for faculty to attend events and participate in initiatives. We renamed Rogers’ attribute of “Complexity” to “Simplicity” to avoid an inverse relationship in the responses. To control for sequence and carryover effects (Bradburn, D’Andrade, Rasinski, & Tourangeau, 1989; McFarland, 1981), we disaggregated the survey items from Rogers’ five attributes and presented them instead as a single intermingled list. Because we were most interested in the behavior of faculty at small colleges, the survey was distributed online to listservs for the New England Faculty Development Consortium (NEFDC) and for the Small Colleges Committee (SC POD) of the Professional and Organizational Development Network (POD). We requested that members of those listservs (primarily faculty developers) distribute the survey at their colleges. Faculty from 12 colleges responded, for a total of 238 responses. All respondents were from small colleges, none with more than 377 faculty members (full- and part-time).

Content validity was established through the sharing of the checklist for comments and modification, and through mapping survey items to Rogers’ framework of attributes. Reliability was not assessed in advance of administration; the internal consistency of the collected responses is reported in the Results section.

Results

The survey received 238 responses (see Table 1 for a summary of participants by faculty status and institutional variables). Tenured or tenure-track full-time faculty (55%, n = 131) and non-unionized part-time faculty (18.9%, n = 45) together represented the majority of respondents, with smaller representation from full-time unionized faculty, non-tenured faculty, and staff with teaching responsibilities. Participant institutions spanned all levels of highest degree granted, with the greatest representation from doctorate-granting (47.1%, n = 113) and bachelors-granting (36.3%, n = 87) institutions. Participants also reported the geographic location of their institutions. Respondents represented three of the four Census Bureau-designated US regions (U.S. Department of Commerce, n.d.), with all but two participants drawn from northeastern (45%, n = 107) and midwestern (54.2%, n = 129) states. One participant reported an affiliation with a fully online institution.

Table 1. Participant Demographics

| Demographic Trait | Frequency (f) | Percentage (%) |
| --- | --- | --- |
| Instructor status | | |
| Faculty, full-time, tenured | 89 | 37.4 |
| Faculty, full-time, tenure-track | 42 | 17.6 |
| Faculty, full-time, non-tenured | 15 | 6.3 |
| Faculty, full-time, unionized | 19 | 8.0 |
| Faculty, full-time, union track | 5 | 2.1 |
| Faculty, part-time, unionized | 1 | 0.4 |
| Faculty, part-time, non-unionized | 45 | 18.9 |
| Staff with teaching responsibilities | 22 | 9.2 |
| Total | 238 | 100.0 |
| Highest degree granted by institution | | |
| Associates | 17 | 7.1 |
| Bachelors | 87 | 36.3 |
| Masters | 21 | 8.8 |
| Doctorate | 113 | 47.1 |
| Total | 238 | 100.0 |
| Location | | |
| Northeastern United States | 107 | 45.0 |
| Midwestern United States | 129 | 54.2 |
| Southern United States | 0 | 0.0 |
| Western United States | 1 | 0.4 |
| Online only | 1 | 0.4 |
| Total | 238 | 100.0 |

The survey items showed high internal consistency (46 items; Cronbach’s α = .901). Table 2 provides the mean response to each item, in the order in which the items appeared in the survey.
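For readers who wish to run the same kind of reliability check on their own survey data, the following is a minimal sketch (not the authors’ actual analysis code) of computing Cronbach’s alpha from a respondents-by-items matrix of Likert scores. The simulated data are placeholders and will not reproduce the reported α = .901.

```python
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items matrix of Likert scores."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]                          # number of items (46 in this survey)
    item_vars = responses.var(axis=0, ddof=1)       # variance of each item across respondents
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of each respondent's total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative only: 238 simulated respondents answering 46 items on the -2..2 scale.
rng = np.random.default_rng(0)
fake_responses = rng.integers(-2, 3, size=(238, 46))
print(round(cronbach_alpha(fake_responses), 3))
```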

Table 2. Participant Responses to Survey Questions
| # | Item | M | SD |
| --- | --- | --- | --- |
| 1 | A friend will be presenting. | 1.05 | 0.59 |
| 2 | After the event, I will have access to ongoing, easily accessed support on this topic. | 0.64 | 0.58 |
| 3 | I am strongly opposed to this topic. | −0.81 | 0.89 |
| 4 | I find the topic interesting. | 1.11 | 0.44 |
| 5 | I have heard success stories about this topic. | 0.89 | 0.55 |
| 6 | I will receive a certificate of participation for my dossier. | 0.24 | 0.67 |
| 7 | I will receive a letter of recognition for my dossier. | 0.25 | 0.68 |
| 8 | I would need to make extensive changes to implement this topic. | −0.36 | 0.75 |
| 9 | I would need to make only a few changes to implement this topic. | 0.42 | 0.65 |
| 10 | My chair or supervisor is attending the event. | 0.27 | 0.53 |
| 11 | My continued involvement may lead to recognition as an “Advisor” on this topic to other faculty. | 0.38 | 0.63 |
| 12 | My continued involvement may lead to recognition as a “Faculty Fellow” in this topic/initiative. | 0.35 | 0.65 |
| 13 | My dean is attending the event. | 0.70 | 0.56 |
| 14 | My friends are attending the event. | 0.63 | 0.56 |
| 15 | My involvement can be included in my promotion portfolio. | 0.52 | 0.66 |
| 16 | My involvement is scalable. I can grow at my own pace. | 0.61 | 0.58 |
| 17 | My involvement will allow me to grow at my own pace. | 0.61 | 0.58 |
| 18 | My involvement will be advertised by the administration. | 0.12 | 0.67 |
| 19 | My involvement will be appreciated by my peers. | 0.74 | 0.53 |
| 20 | My involvement will have a positive impact. | 1.08 | 0.49 |
| 21 | My involvement will help me grow as a teacher. | 1.25 | 0.50 |
| 22 | My involvement will make me a part of a larger community on campus. | 0.82 | 0.59 |
| 23 | Participation in this event will fit easily into my schedule. | 1.25 | 0.49 |
| 24 | People from my department are attending the event. | 0.63 | 0.61 |
| 25 | People from my discipline are attending the event. | 0.68 | 0.60 |
| 26 | People I respect are attending the event. | 0.79 | 0.54 |
| 27 | Someone I respect helped to organize the event. | 0.97 | 0.50 |
| 28 | Someone I respect will be participating actively in the event. | 0.93 | 0.54 |
| 29 | Someone I respect will be presenting. | 1.03 | 0.50 |
| 30 | The event includes a snack. | 0.27 | 0.49 |
| 31 | The event includes dinner. | 0.25 | 0.65 |
| 32 | The event includes lunch. | 0.38 | 0.62 |
| 33 | The President is attending the event. | 0.23 | 0.58 |
| 34 | The topic fits the institutional narrative. | 0.46 | 0.60 |
| 35 | The topic has been included at the President’s request. | 0.38 | 0.65 |
| 36 | The topic is a natural extension of my own practices. | 1.06 | 0.58 |
| 37 | The topic is clearly defined and has a manageable set of steps for involvement. | 0.86 | 0.51 |
| 38 | The topic is easy to adopt. | 0.79 | 0.53 |
| 39 | The topic is easy to understand. | 0.70 | 0.57 |
| 40 | The topic of this event is a good fit with my beliefs. | 0.87 | 0.54 |
| 41 | The topic of this event is part of an institutional agenda. | 0.60 | 0.66 |
| 42 | The topic of this event originated in my department. | 0.55 | 0.52 |
| 43 | The topic resonates with my discipline. | 1.05 | 0.50 |
| 44 | The topic will address a problem the campus has faced for several years. | 1.01 | 0.56 |
| 45 | The topic will serve to advance the common good on campus. | 1.03 | 0.55 |
| 46 | There will be opportunities for me to experiment with the initiative or topic before deciding if it is right for my classes. | 0.87 | 0.56 |
Note. Responses are on a 5-point Likert scale (−2, −1, 0, 1, 2), with 0 indicating a neutral response.

One-way analyses of variance (ANOVAs) were used to test for preference differences by (a) faculty rank, (b) full-time status, (c) union status, (d) highest degree granted by participant institution, and (e) geographic region. No differences were found by degree granted or geographic region. Part-time faculty reported a higher degree of interest than full-time faculty in events that include a snack, F(1, 236) = 9.32, p = .003, lunch, F(1, 236) = 6.90, p = .009, or dinner, F(1, 236) = .040, p = .040, or if their supervisor would be attending, F(1, 236) = 10.66, p = .001, or if the event would lead to recognition as an advisor on the topic, F(1, 236) = 6.61, p = .011. In regard to tenure status, participants who were tenured or in tenure-track positions were more likely than non-tenure-track faculty to attend if the topic would address a problem that the campus has faced for several years, F(1, 236) = 4.12, p = .043. Non-unionized participants were more likely than unionized participants to attend if the topic originated in their department, F(1, 236) = 4.21, p = .041, or if someone they respected would be presenting, F(1, 236) = 5.48, p = .02.
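As an illustration of the analysis described above, the sketch below runs a one-way ANOVA comparing two groups on a single survey item using SciPy. The group labels, sample sizes, and scores are hypothetical placeholders rather than the study’s data.

```python
import numpy as np
from scipy import stats

# Hypothetical ratings of one survey item (e.g., "The event includes lunch") on the -2..2 scale,
# split by employment status. Sizes and values are placeholders, not the study's data.
rng = np.random.default_rng(1)
full_time = rng.integers(-1, 3, size=190)
part_time = rng.integers(0, 3, size=48)

# With two groups, a one-way ANOVA is equivalent to an independent-samples t test (F = t^2).
f_stat, p_value = stats.f_oneway(full_time, part_time)
print(f"F(1, {len(full_time) + len(part_time) - 2}) = {f_stat:.2f}, p = {p_value:.3f}")
```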

To explore the possibility that a preponderance of survey respondents were those individuals who already attend faculty development regularly, the same series of one-way ANOVAs was applied to a subset of the data consisting only of respondents from institutions where at least 20% of faculty members (full- and part-time) had completed the survey, indicating that participants represented more than a few highly engaged faculty on that campus. This reduced the respondents studied to a total of 78 from two colleges. Results mirrored those of the full data set in the items that were statistically significant, with the following exceptions: (a) no significant difference was found between unionized and non-unionized participants’ likelihood to attend if the topic originated in their department, and (b) faculty at these institutions reported that they were more likely to attend if the president were going to be in attendance, F(1, 157) = 4.41, p = .04, or had specifically requested the topic, F(1, 157) = 8.47, p = .004.

Discussion

Almost all of the items in the survey demonstrated positive relationships with faculty members’ intentions to attend teaching and learning events, though some of these relationships were quite small. A few significant differences also emerged between full-time tenured faculty and part-time faculty. The most important findings concerned the relationships between attendance and social relationships, food, and time requirements. Difficulties with the Rogers model as an organizer for this work were also noted.

Many faculty developers insist that “if you feed them, they will come.” Our data suggest otherwise. Providing food at an event was shown to be a weak motivator at best, with lunch providing a small boost in participant interest (M = 0.38, SD = 0.62) and snacks (M = 0.28, SD = 0.49) or dinner (M = 0.25, SD = 0.62) ranked as even less important. Part-time faculty (M = 0.52, SD = 0.72) rated food as slightly more important to their decision-making process than did full-time faculty (M = 0.33, SD = 0.56). Jones (2007) hypothesized that food has a ceremonial importance on campus: “breaking bread” with one’s colleagues, he argued, breaks down social barriers. Yet although teaching and learning events are important social occasions on campus (Briggs, 2007, p. 689), none of the participants emphasized the social or ceremonial role of food. As one respondent noted, “if the event occurs over a mealtime, then I would be more likely to attend if a meal is provided. But food is not the draw—just necessary at certain times of the day.” Edward Nuhfer (2014) commented on the findings, explaining, “One reason food works at events is because lunch is the only hour most faculty have free, although even that is getting usurped by committees these day.” Scheduling is one of the most important motivators for attendance (M = 1.25, SD = 0.49), so holding teaching and learning events at mealtimes ties food, a weak motivator, to scheduling, a strong one. Requiring faculty members to preregister for an event may also make them more likely to attend on the day of the event, out of a sense of obligation. In addition, while food may not be an important motivating factor, the lack of food could be an inhibiting factor. Schifter (2000, pp. 17–18) demonstrated that different sets of factors can emerge when survey participants are asked to identify both motivating and inhibiting factors, a distinction that was not part of this survey. Abes, Jackson, and Jones (2002) investigated the adoption of service learning by faculty members, but did so by investigating only potential deterrents to change; nevertheless, they discovered through qualitative responses that faculty members who adopted service learning did so largely for intrinsic reasons.

A number of motivating factors are identified by our highest-rated survey choices. Faculty members showed interest in attending if the topic resonated with their personal interests, their own practices, or their discipline. More encouragingly, they indicated they would also attend if the topic either advanced the common good on campus or addressed a problem the campus has faced for several years. Finally, they would attend if a friend or someone they respect were attending. Table 3 provides a summary of these highest-rated items. The present results also suggest that the identity of the presenter, and of other attendees, is an important factor when faculty members are deciding whether to attend faculty development events (see Table 4). A clear hierarchy emerged regarding which people influenced this decision the most. Attending an event where a friend is presenting appears to be most important, followed by presentations by people whom the faculty member respects. The presence of people from one’s discipline, and then from one’s department, are the next highest draws. Non-unionized faculty are more likely to be interested in topics that originated in their department, while tenured or tenure-track faculty members are more likely to attend if the event or initiative addresses a broader campus problem. Finally, while the presence of administrators such as chairs, presidents, and deans does serve as a positive influence, it does so just barely. Within the realm of administrative influence, part-time faculty are more influenced than full-time faculty by the presence of their supervisors and by the potential to earn the title of “advisor” on a topic.

Table 3. Highest Rated Survey Items

| Item | M |
| --- | --- |
| My involvement will help me grow as a teacher. | 1.25 |
| Participation in this event will fit easily into my schedule. | 1.25 |
| I find the topic interesting. | 1.13 |
| My involvement will have a positive impact. | 1.08 |
| The topic is a natural extension of my own practices. | 1.06 |
| A friend will be presenting. | 1.05 |
| The topic resonates with my discipline. | 1.05 |
| The topic will serve to advance the common good on campus. | 1.03 |
| Someone I respect will be presenting. | 1.03 |
| The topic will address a problem the campus has faced for several years. | 1.01 |

Table 4. People Who Influence Faculty to Attend

| Item | M |
| --- | --- |
| A friend will be presenting. | 1.05 |
| Someone I respect will be presenting. | 1.03 |
| Someone I respect helped to organize the event. | 0.97 |
| Someone I respect will be participating actively in the event. | 0.93 |
| People I respect are attending the event. | 0.79 |
| My involvement will be appreciated by my peers. | 0.74 |
| People from my discipline are attending the event. | 0.68 |
| People from my department are attending the event. | 0.63 |
| My friends are attending the event. | 0.63 |
| My chair or supervisor is attending the event. | 0.27 |
| The President is attending the event. | 0.23 |
| My dean is attending the event. | 0.17 |

The survey’s findings are limited in that they include only self-reported data. Though faculty members identify the involvement of upper administration as least important to their decisions, we note that events at our institutions that are supported and attended by upper administration also draw high faculty attendance. For that reason, this survey is best viewed as identifying what faculty members will choose to do of their own volition, which is useful for bottom-up strategies of faculty development. Traditional management techniques obviously still play a role in attendance at events, and it is likely that both methods motivate change in the faculty: bottom-up strategies encourage faculty buy-in, while top-down strategies demand it.

Schifter (2000) is the clearest example of a study that anticipates this difference between top-down and bottom-up strategies of faculty development. Schifter researched motivating and inhibiting factors for faculty participation in asynchronous learning networks. Administrators in that study did a poor job of predicting motivating factors for the faculty, identifying only one of the five factors identified by the faculty themselves (i.e., personal motivation to use technology). Administrators tended to emphasize extrinsic motivators (monetary support for participation, credit toward promotion and tenure, and release time), whereas faculty members tended to emphasize intrinsic motivators (the opportunity to develop new ideas, the opportunity to improve their teaching, the opportunity to diversify program offerings, and greater flexibility for students). Administrators did much better when asked to identify inhibiting factors, reporting four of the same five factors as faculty members (lack of technical support provided by the institution, lack of release time, concern about faculty workload, and lack of grants for materials/expenses). On the remaining factor, faculty members cited concern about the quality of courses, while administrators anticipated resistance based on the lack of merit pay. This suggests that motivating factors are more important to bottom-up strategies, while removing inhibiting factors is more important to top-down strategies. Dillon and Walsh (1992) likewise found that faculty motivation to teach distance education courses depended largely on intrinsic rather than extrinsic motivators. Finally, Mannix (2012) surveyed faculty about their motivation to teach and found that management and superiors accounted for 34.5% of the demotivating factors identified by the surveyed faculty.

Still, the presence of friends or colleagues does not necessarily secure attendance when a faculty member holds a strong negative opinion of a given topic. Survey respondents indicated that they were very unlikely to attend an event whose topic they strongly opposed (M = −0.81, SD = 0.89), perhaps because doing so would be aversive, or perhaps because they saw no reason to spend further time on it, even if the event offered social interaction with their peers.

Faculty prefer events that fit easily into their schedules and that do not require extensive changes to their instruction. On the items “Participation in this event will fit easily into my schedule” and “I would need to make extensive changes to implement this topic,” participants indicated that they were more likely to attend events that would fit easily into their schedules (M = 1.25, SD = 0.49) and less likely to attend events on topics that would require extensive changes to implement (M = −0.36, SD = 0.75).

The question of where faculty members choose to invest their professional time was another area highlighted by this survey. Faculty members have full schedules, and their need to maximize time efficiency can have an important impact on their choices. For example, Sorenson and Bothell (2004) showed that time-intensive changes to a course are rarely implemented, even when faculty members believe them to be more effective. Using National Survey of Student Engagement (NSSE) data for Brigham Young University (BYU), the researchers demonstrated a near-inverse relationship between faculty methods of teaching and faculty beliefs about teaching. Faculty in their study frequently used assessment methods (e.g., exams, class presentations, quizzes, and attendance) that they believed to be less effective in assessing student learning, while using the methods they believed to be most effective (e.g., field work, capstone projects, performances, and portfolios) much less frequently. Sorenson and Bothell did not explain this pattern, but one possible explanation seems clear: assessing student achievement using capstone projects or portfolios is much more time-intensive than the less effective types of assessment that faculty members more frequently employed. Given the need to balance their overall job responsibilities with the time it might take to assess students in a more meaningful way, time management may well have become more important than accurately measuring student learning.

As with all studies, the results of this survey must be considered with a number of limitations in mind. Given past experience with the “usual suspects,” it is very possible that the same thing has happened with this survey: the faculty members who care most about faculty development may have responded at higher rates than others. As such, the pattern of responses may be more representative of highly engaged faculty than of the faculty as a whole. In an attempt to control for this possible bias, we also analyzed data from only the two schools with survey response rates above 20%. In future versions of this survey, a question should be added to identify the respondent’s level of prior engagement with faculty development, in order to investigate this potential bias.

While the conceptual frameworks offered by Rogers and Wergin continue to be of interest, the high proportion of positive responses to all survey items, combined with internal overlap between the attributes, made it difficult to contextualize the results specifically within these frameworks. For example, overlap occurs because “Advantage” includes both financial advantage and status-giving, while status-giving in academia seems closely tied to “Observability.” Factor analyses of the data did not group survey items in accordance with either Rogers or Wergin, nor was there any discernible pattern that might be used to provide an alternative framework. Future survey versions will need to establish validity and reliability more firmly, and take areas of overlap more clearly into account, if they are to provide a clear conclusion in this regard. As a step in this direction, Table 5 provides a rewritten checklist for faculty attendance, organized by Wergin’s motives, which provides a model without overlap between potential survey items.
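The factor analysis referred to above could be approximated with a standard exploratory routine such as the following sketch, which extracts five factors from a responses matrix and reports each item’s dominant loading. The data shown are simulated placeholders, and the original analysis may have used different software or rotation choices.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical stand-in for the 238 x 46 matrix of Likert responses (placeholder data only).
rng = np.random.default_rng(2)
responses = rng.integers(-2, 3, size=(238, 46)).astype(float)

# Extract five factors to see whether items group along Rogers' five attributes.
fa = FactorAnalysis(n_components=5, random_state=0)
fa.fit(responses)
loadings = fa.components_.T  # shape: 46 items x 5 factors

# Report, for each item, the factor on which it loads most heavily.
dominant_factor = np.abs(loadings).argmax(axis=1)
for item, factor in enumerate(dominant_factor, start=1):
    print(f"Item {item:2d} -> factor {factor + 1}")
```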

Table 5. Faculty Member’s Checklist for Deciding to Attend an Event (Version 2), Organized in Accordance With Wergin (2001)

| Domain | Survey Item | Average |
| --- | --- | --- |
| Efficacy | My involvement will help me grow as a teacher. | 1.25 |
| | Participation in this event will fit easily into my schedule. | 1.25 |
| | My involvement will have a positive impact. | 1.08 |
| | I have heard success stories about this topic. | 0.89 |
| | There will be opportunities for me to experiment with the initiative or topic before deciding if it is right for my classes. | 0.87 |
| | The topic is clearly defined and has a manageable set of steps for involvement. | 0.86 |
| | The topic is easy to adopt. | 0.79 |
| | The topic is easy to understand. | 0.70 |
| | After the event, I will have access to ongoing, easily accessed support on this topic. | 0.64 |
| | My involvement is scalable. I can grow at my own pace. | 0.61 |
| | My involvement will allow me to grow at my own pace. | 0.61 |
| | I would need to make only a few changes to implement this topic. | 0.42 |
| | I would need to make extensive changes to implement this topic. | −0.36 |
| Autonomy | I find the topic interesting. | 1.13 |
| | The topic is a natural extension of my own practices. | 1.06 |
| | The topic resonates with my discipline. | 1.05 |
| | The topic of this event is a good fit with my beliefs. | 0.87 |
| | The topic of this event originated in my department. | 0.55 |
| | I am strongly opposed to this topic. | −0.81 |
| Community | A friend will be presenting. | 1.05 |
| | The topic will serve to advance the common good on campus. | 1.03 |
| | Someone I respect will be presenting. | 1.03 |
| | The topic will address a problem the campus has faced for several years. | 1.01 |
| | Someone I respect helped to organize the event. | 0.97 |
| | Someone I respect will be participating actively in the event. | 0.93 |
| | My involvement will make me a part of a larger community on campus. | 0.82 |
| | People I respect are attending the event. | 0.79 |
| | My involvement will be appreciated by my peers. | 0.74 |
| | People from my discipline are attending the event. | 0.68 |
| | People from my department are attending the event. | 0.63 |
| | My friends are attending the event. | 0.63 |
| | The topic of this event is part of an institutional agenda. | 0.60 |
| | My involvement can be included in my promotion portfolio. | 0.52 |
| | The topic fits the institutional narrative. | 0.46 |
| | The topic has been included at the President’s request. | 0.38 |
| | The event includes lunch. | 0.38 |
| | My chair or supervisor is attending the event. | 0.27 |
| | The event includes a snack. | 0.27 |
| | The event includes dinner. | 0.25 |
| | The President is attending the event. | 0.23 |
| | My dean is attending the event. | 0.17 |
| Recognition | My continued involvement may lead to recognition as an “Advisor” on this topic to other faculty. | 0.38 |
| | My continued involvement may lead to recognition as a “Faculty Fellow” in this topic/initiative. | 0.35 |
| | I will receive a letter of recognition for my dossier. | 0.25 |
| | I will receive a certificate of participation for my dossier. | 0.24 |
| | My involvement will be advertised by the administration. | 0.12 |

Note. See Table 2 for standard deviations.

Conclusion

Constructing faculty development events that appeal to a broad audience is not easy. The results of this survey provide new and supporting information about which elements of a topic or initiative are most important to faculty and instructional staff, while also helping us understand faculty thinking a bit more deeply. The chief lesson is that faculty care about their personal relationships more than their professional ones, so faculty developers would be well advised to understand the social relationships of the faculty and to be able to identify the various cliques within their institution. If there is a desire for faculty members to become leaders on campus, tenure-track positions seem more conducive to such leadership than unionization. Food is not very important for events, despite its role in ceremony, celebration, and the easing of social boundaries. Finally, part-time faculty members are more conscious than full-time faculty of their supervisors’ attention to their behavior.

References

  • Abes, E., Jackson, G., & Jones, S. (2002). Factors that motivate and deter faculty use of service learning. Michigan Journal of Community Service Learning, 9(1), 5–19.
  • Bradburn, N., D’Andrade, R., Rasinski, K. A., & Tourangeau, R. (1989). Carryover effects in attitude surveys. Public Opinion Quarterly, 53(4), 495–524.
  • Briggs, C. (2007). Curricular collaboration: A key to continuous program renewal. The Journal of Higher Education, 78(6), 676–711.
  • Dillon, C. L., & Walsh, S. M. (1992). Faculty: The neglected resource in distance education. The American Journal of Distance Education, 6(3), 5–21.
  • Dooley, K. (1999). Towards a holistic model for the diffusion of educational innovations: An integrative review of educational innovation studies. Educational Technology & Society, 2(4), 35–45.
  • Fullan, M. (2005, November). 10 do and don’t assumptions about change. The Learning Principal, 1(3). (Reprinted from The new meaning of educational change, 3rd ed., pp. 108–110, by M. Fullan, 2001, New York, NY: Teachers College Press).
  • Jones, M. O. (2007). Food choice, symbolism, and identity: Bread and butter issues for folkloristics and nutrition studies (American Folklore Society Presidential Address, October 2005). The Journal of American Folklore, 120(476), 129–177.
  • Lewis, R. F. (1991). How attitudes change: A primer for faculty developers. To Improve the Academy, 10, 35–46.
  • Mannix, V. (2012). Exploring the concept of teaching faculty motivation. International Journal for Cross-Disciplinary Subjects in Education (IJCDSE), 2(1), 911–918.
  • McFarland, S. G. (1981). Effects of question order on survey responses. Public Opinion Quarterly, 45(2), 208–215.
  • Medina, M. S., Garrison, G. D., & Brazeau, G. A. (2010). Finding time for faculty development. American Journal of Pharmaceutical Education, 74(10), 1–2.
  • Moser, F. (2007). Faculty adoption of educational technology. Educause Quarterly, 30(1), 66–69.
  • Mullinix, B. (2007, April). Strategies for diffusion of homegrown teaching innovations. Paper presented through the Faculty Training Evaluation and Development SIG at the American Educational Research Association (AERA). Chicago, IL.
  • Nuhfer, E. (2014, March 3). RE: [POD] Research on use of food at events. POD Listserv. Retrieved from POD@LISTSERV.ND.EDU
  • Rogers, E. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.
  • Sahin, I. (2006). Detailed review of Rogers’ diffusion of innovations theory and educational technology related studies based on Rogers’ theory. The Turkish Online Journal of Educational Technology, 5(2), 14–23.
  • Schifter, C. (2000). Faculty participation in asynchronous learning networks: A case study of motivating and inhibiting factors. Journal of Asynchronous Learning Networks, 4(1), 15–22.
  • Sorenson, D. L., & Bothell, T. W. (2004). Triangulating faculty needs for the assessment of student learning. To Improve the Academy, 22, 23–40.
  • U.S. Department of Commerce. (n.d.). Census regions and divisions of the United States. Retrieved from http://www.census.gov/geo/maps-data/maps/pdfs/reference/us_regdiv.pdf
  • Wallis, H. B. (Producer), & Curtiz, M. (Director). (1942). Casablanca [Motion Picture]. United States: Warner Bros.
  • Wergin, J. F. (2001). Beyond carrots and sticks. Liberal Education, 87(1), 50–53.
  • Zahorski, K. J. (1993). Taking the lead: Faculty development as institutional change agent. To Improve the Academy, 12, 227–245.