Abstract

Some scholars want help from educational developers to become better, more prolific writers. This study examines one such program, Publish & Flourish, that holds participants accountable for writing daily and for receiving weekly feedback from peers on drafts of their writing. In this mixed-methods study, 95% of participants (N = 93) reported that they improved their writing by making it more organized and reader-centered. Participants also reported that they increased their extrapolated average number of scholarly manuscripts submitted per year from about two to almost six. We then compared Publish & Flourish to several other studies of scholarly writing programs, each of which had only a few participants or did not have participants report their minutes of writing on a shared and monitored spreadsheet. We concluded that if daily writing is required and reported on a shared and monitored spreadsheet, scholarship programs can scale to large numbers of participants. Therefore, Publish & Flourish, with its accountability for writing daily, may be superior to other programs for educational developers who want to improve writing and jumpstart research productivity with many participants on a large scale.

Keywords: scholarly productivity, scholarship, faculty development, graduate student development

Some scholars want help from educational developers to become better, more prolific writers. Much academic writing is described as “turgid” (Mills, 1959, p. 217; Sword, 2012, p. 5; Williams & Bizup, 2016, p. 5). In some quarters, scholars actually maintain that convoluted prose is required by peers and editors (Billig, 2013; Limerick, 1998). As higher education increasingly comes under fire, writing in this tangled way causes academics to seem ever less relevant in contributing to policy: “The legislators are convinced that the university is… wasting state resources on pointless research. Under these circumstances, the miserable writing habits of professors pose a direct and concrete danger to higher education” (Limerick, 1998, p. 201).

Some scholars also face a problem with productivity. Lack of productivity has negative implications for the advancement of knowledge and of careers. Every academic has good ideas. Knowledge suffers when those ideas are not formalized, published and made accessible to others. Furthermore, faculty members who produce fewer publications have fewer opportunities for advancement. Indeed, “counting pubs” is the de facto standard in many colleges and universities and is unlikely to change in the near future (Bellas & Toutkoushian, 1999; Jensen, 2017). Scholarly productivity is the single best predictor of faculty salaries across many types of institutions (Fairweather, 2005).

Although scholarly productivity is the “coin of the realm,” many academics “struggle mightily” to accrue it (Jensen, 2017, p. 4). This is especially true for women and faculty of color, although the gap is shrinking (Eagan et al., 2014; Rice, Sorcinelli, & Austin, 2000; Schuster & Finkelstein, 2006; Yun & Sorcinelli, 2009). The problem starts in graduate school, when students first experience challenges in scholarly productivity due to high expectations and inadequate training (Humphrey & Simpson, 2012). The problem continues into the professoriate, as shown by a national survey that found that a full 28% of faculty reported not having published a manuscript or had a manuscript accepted in the past two years (Eagan et al., 2014, p. 30).

Low productivity for some scholars is not a new phenomenon, and educational developers are addressing it. More than 30 years ago, Boice (1984) discussed the problem in the well-known article, “Why Academicians Don’t Write.” According to Boice (1984), the reasons include lack of time, distractions, writing blocks, and the inherent difficulty of writing. Educational developers can continue to address these issues by hosting writing programs that help scholars improve their writing and increase their productivity (Boice, 1989; Gray & Birch, 2000; McGrail, Rickard, & Jones, 2006).

The current study examines one such writing program, Publish & Flourish, which aims to help scholars become better, more productive writers. The study echoes and extends the findings of three comparison studies: McGrail et al. (2006), Boice (1989), and Gray and Birch (2000). The first comparison study was a meta-analysis of three types of writing programs: writing workshops, support groups, and coaches (McGrail et al., 2006). These programs seemed to improve writing productivity, as shown by pre/post data (McGrail et al., 2006). In fact, McGrail et al. (2006) identified six studies that measured pre/post publication rates and found that publication rates improved at least twofold.

These six studies had three limitations (McGrail et al., 2006). First, the number of scholars in each study was small, ranging from 5 to 40. The most remarkable results came from the two smallest studies, one with N = 5 (McVeigh et al., 2002) and one with N = 8 (Page-Adams, Cheng, Gogineni, & Shen, 1995). In these two studies of writing support groups, the pre versus post rate of publication increased by 4 and 9 times, respectively. Therefore, these studies greatly influenced the finding that the six studies more than doubled productivity. Second, the authors of the smallest studies did not mention that they excluded outliers (participants whose productivity was much higher than that of other participants). This is a problem because in one study, 19 manuscripts were submitted or published by eight participants, and nearly one-third of those manuscripts were written by one person (Page-Adams et al., 1995). Outliers such as this are problematic in a small sample because they lead researchers to overestimate the effects of an intervention. Third, these studies did not examine improvements in writing quality. The current study responds to the limitations of these six studies by increasing sample size, excluding outliers, and measuring the extent to which scholars thought the quality of their prose improved.

The second comparison study, a classic, was undertaken by the guru of scholarly publishing, Robert Boice (1989). Boice is a behavioral psychologist whose writing on the subject includes 20 articles and four books. Boice (1989) used a strong intervention for a few scholars who were previously unproductive. The intervention seemed to increase scholarly productivity by requiring participants to write daily, keep records of minutes spent writing, and accept surprise office visits from the researcher during their scheduled writing times. To begin the program, all participants attended a series of three-hour workshops in which Boice explained the importance of writing daily and keeping records of minutes spent writing. At the end of the workshops, a control group of participants chose to continue writing occasionally in big blocks of time, as they had prior to the intervention. An experimental group kept daily records of their writing minutes and received coaching from Boice and his assistant during the surprise visits. Boice measured productivity as the average pages written or revised by each participant per week (Boice, 1989, p. 609). We have extrapolated the weekly averages to yearly averages: the participants in the control group wrote or revised an extrapolated average of 17 pages during the study year; in contrast, the participants in the experimental group averaged 157 pages (Boice, 1989, p. 609).

The Boice study had five limitations. First, the experimental and control groups were small, only 10 participants each (Boice, 1989, p. 609). Second, no mention is made of removing outliers. Third, the “surprise office visit” intervention was so intensive that few educational developers would want to replicate it by surprising scholars in their offices during their assigned writing times. Fourth, Boice noted that the participants in his study were previously unproductive. Because the control group wrote or revised only 17 pages during the study year, this low base allowed the experimental group to outproduce it by a huge factor. Again, the experimental group wrote or revised 157 pages per year, roughly nine times as many as the control group (Boice, 1989, p. 609). If, instead, the writers in the control group had written, say, 100 pages per year, the experimental group would have had to produce 900 pages of prose per year to achieve the same factor increase. Fifth, changes in the perceived quality of prose were not measured. As we designed the current study, we wondered whether a substantial improvement in writing productivity and writing quality could be achieved by a larger group of more productive writers with a less intensive intervention.
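
This base-rate effect can be made concrete with a few lines of arithmetic. The page counts are Boice's extrapolated yearly figures; the 100-page scenario is the hypothetical above, not actual data:

```python
# Boice (1989, p. 609): extrapolated pages written or revised per year
control_pages = 17
experimental_pages = 157

# Factor by which the experimental group outproduced the control group
factor = experimental_pages / control_pages
print(round(factor, 1))  # 9.2, i.e., roughly ninefold

# Hypothetical: had the control group produced 100 pages per year,
# a ninefold increase would have required 900 pages per year
print(100 * 9)  # 900
```

The same ratio thus implies very different absolute outputs depending on the baseline.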

The third comparison study was of an earlier version of Publish & Flourish (Gray & Birch, 2000). The earlier version was designed to build on the successes and address some of the limitations of Boice (1989). Like Boice (1989), the program encouraged scholars to write daily. Unlike Boice, Gray and Birch (2000) designed the program for larger numbers (N = 115) of more productive scholars, who were held accountable without office visits from the researchers. During this first version of the program, participants wrote an average of two days a week and produced an extrapolated 75 pages per year of new prose, or 150 pages of new or revised prose if each page was also counted as revised. (Note that each page of new prose could be counted as revised only once.) Unfortunately, the program had two important limitations. First, in a program designed for participants to write every day, participants wrote only two days a week. Second, participants were not asked to report how many pages they wrote prior to the intervention, so no pre/post comparison was possible.

In designing the current version of Publish & Flourish, we sought to improve upon the first version by helping participants write daily rather than twice a week (Gray & Birch, 2000). In the first version, participants were held accountable for writing daily by recording minutes spent writing on paper forms submitted to the program director only. In the most recent version, participants recorded minutes in a spreadsheet that was shared with all program participants and monitored by a Center for Teaching and Learning (CTL) staff member. We predicted that peer pressure would increase the number of days of writing per week as well as scholarly productivity. Unlike Boice (1989), we had no control group; instead, we compared productivity changes in the same group of writers before and during the nine-week intervention.

In addition to increasing productivity, both versions of Publish & Flourish aimed to help scholars become better writers in two ways. First, participants learned how to organize academic paragraphs around key or topic sentences, which is more difficult than it would appear to be. This method of revision by paragraph stands in sharp contrast to the easier but less effective method of revising sentence by sentence (Belcher, 2009; Booth et al., 2016; Gray, 2015; Williams & Bizup, 2016). The leading American expert on student writing reported that, when he is asked to help faculty with their writing, he comments on topic sentences more frequently than anything else (Bean, 2011, p. 325).

Second, participants exchanged drafts of writing in weekly writing groups. The groups were an additional way to hold participants accountable for writing regularly. The groups also provided a means for scholars to get formative feedback from others, which has been shown to improve manuscripts on 33 of 34 measures (Goodman, Berlin, Fletcher, & Fletcher, 1994). During the groups, both the writer and the readers read the manuscript simultaneously; as they read, they identified key sentences. Having the two parties read the manuscript and search for key sentences simultaneously has the same benefits for the writer as watching oneself on a video recording; the writer begins to see his or her writing as a reader would (Gray, 2015; Gray, Birch, & Madson, 2013). Searching for key sentences forces a close read because readers have to decide which sentence is key, or write a key sentence if one is missing. This method of reading helps readers generate suggestions for improving prose, both about key sentences and beyond them (Gray, 2015).

In sum, the current mixed-methods study echoes and extends the findings from McGrail et al. (2006), Boice (1989), and Gray and Birch (2000). The study addresses three questions. To what extent does Publish & Flourish:

  1. increase productivity as compared to McGrail et al. (2006) and Boice (1989), given that Publish & Flourish was conducted on a larger scale and removed outliers?

  2. increase productivity as compared to Gray and Birch (2000), given that Publish & Flourish now holds participants accountable with a shared and monitored spreadsheet?

  3. improve the quality of participant prose?

Methods

Participants

This mixed-methods study was conducted at New Mexico State University during the years 2011, 2013, and 2015. Participants were recruited through mass emails sent to about 1,000 graduate students and 1,000 full- and part-time faculty. The sample was not representative of the population; it was a sample of highly motivated scholars determined to become better, more prolific writers. Participants could choose to attend just one half-day opening workshop or attend both the opening workshop and the writing groups that followed; the results reported below come from those who attended both (N = 99). Study participants signed letters of informed consent during the opening workshop, and the study was approved by the Institutional Review Board.

Intervention

As mentioned above, Publish & Flourish began with an opening workshop, followed by writing groups that met an hour per week for nine weeks. In the opening workshop, participants were encouraged to take several steps. They were asked to write every day, even if only for 15–30 minutes, and to hold themselves accountable to others by recording their daily minutes of writing on a shared and monitored spreadsheet. Participants were also asked to organize paragraphs around key sentences. Finally, participants learned a protocol for using key sentences to exchange feedback on each other’s work during writing groups.

Before work in groups could proceed, groups were formed and leaders were chosen. Each group included a stable membership of three or four writers, including a group leader. To form groups, participants were divided between faculty and graduate students and then assigned to groups loosely by discipline (STEM vs. non-STEM). The writing group leader acted as a facilitator and writing coach. All group leaders were recruited by the program director in the weeks before the opening workshop. The faculty leaders and some of the graduate student leaders were successful alumni of Publish & Flourish. The remaining graduate student leaders were recruited from frequent participants at the CTL and from the rhetoric and professional writing program.

For the weekly writing groups, each participant brought three pages of a manuscript and each manuscript was reviewed for 15–20 minutes. During the first five minutes, participants searched each paragraph for a key sentence. Then, participants shared which sentence they thought was key. If answers differed, discussion ensued. Once differences were discussed, participants focused on any other aspect of the paragraph that needed attention. Each manuscript was examined paragraph by paragraph until there was one minute left. At that time, the process stopped and a “positive” round ensued. During the positive round, participants showered the writer with praise for the good things in the manuscript. The idea was to help the writer want to write another day (Gray, 2015; Gray et al., 2013).

Much has been written about Publish & Flourish. For more detail about the contents of the opening workshop and how the writing groups are facilitated, see Gray (2015). For more detail about how educational developers can direct Publish & Flourish without a large budget, a lot of time, or a specially trained staff, see Gray and Birch (2000) and Gray et al. (2013).

Evaluation

Using self-report data, we measured the extent to which participants became better, more prolific writers. To measure productivity, participants were required to record their minutes spent writing each day on a shared and monitored spreadsheet. The spreadsheet was shared among all participants because we suspected that peer pressure would help participants write more often, which in turn would increase productivity. We knew that some writers would need reminders to record their minutes on the spreadsheet consistently. Therefore, a CTL staff member monitored the spreadsheet every week. On odd weeks, the staff member began by dividing the writers into two groups: those who had reported all their daily minutes (including zeros) and those who had not. She then sent each group a different email that began with an inspirational quotation about writing. In one email, writers who had completed the spreadsheet were applauded and thanked. In the other, writers who had not recorded their minutes were reminded to do so. On even weeks, the staff member sent personalized reminders to the few participants who still had not completed the spreadsheet. She reported that sending these emails took about 20 minutes a week.

We then compared participant productivity before and after the program via confidential pre/post surveys. (The surveys were confidential rather than anonymous to allow us to collate pre/post responses.) In the pre-program survey, participants were asked to use their computer records to estimate the number of pages they wrote and revised and the number of manuscripts they submitted during the previous 24 weeks (i.e., during the fall semester, winter break, and the first few weeks of the spring semester before Publish & Flourish began). In the post-program survey, participants were asked for the same estimates for the nine weeks of the program.

On both the pre- and post-program surveys, participants reported pages written and revised as well as manuscripts submitted. Pages were tracked as new pages written and pages revised. Participants reported the number of new pages written before and during the intervention, including pages in grant proposals, journal articles, peer-reviewed conference proceedings, book reviews, book proposals, books, theses, or dissertations. Participants also reported the number of pages revised both before and during the intervention. A page could be counted as revised only once, so a scholar who wrote 100 pages and revised them all could report no more than 200 pages. Participants also reported the number of manuscripts submitted during the study period. Manuscripts were defined as articles, chapters, or proposals submitted to publishers, granting agencies, or a graduate committee. Participants were told that if they submitted three chapters of a book or thesis, it would count as three manuscripts. Graduate students were told that submissions did not include papers written as class assignments.

Finally, we administered a separate, anonymous survey using Likert-scaled and open-ended questions to measure perceptions of changes in both productivity and writing quality. Five Likert-scaled items ranged from 1 = disagree to 7 = agree. Participants were counted as agreeing if they answered 5 or higher. The five items shared the stem, “The program helped me…”: (a) write more often, (b) write more minutes per week, (c) improve my writing, (d) write paragraphs that are organized around key sentences, and (e) get meaningful feedback from readers. Two open-ended questions were asked: As a result of Publish & Flourish, (a) how did your writing process change (i.e., how you write, how often you write)? and (b) how did your writing product change (i.e., how well you write and how much you got written)?

Results

Over half of the participants who attended the opening workshop elected to finish the nine-week program. To be exact, 99 of the 177 participants in the opening workshop finished the program: 37 in 2011, 30 in 2013, and 32 in 2015. Prior to analyzing our research questions, we removed six participants from the sample as outliers. An outlier was defined as a participant who produced no writing during the program or whose pre-program productivity (measured in pages written) was more than 2.5 standard deviations from the sample mean. After the outliers were removed, 93 participants remained in the data set.
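
The exclusion rule can be sketched as follows; the function and the sample data are illustrative assumptions, not the study's actual records or analysis code:

```python
import statistics

def exclude_outliers(pages_pre, pages_during):
    """Keep participants who wrote during the program and whose
    pre-program page counts lie within 2.5 SDs of the sample mean."""
    mean = statistics.mean(pages_pre)
    sd = statistics.stdev(pages_pre)
    return [
        (pre, during)
        for pre, during in zip(pages_pre, pages_during)
        if during > 0 and abs(pre - mean) <= 2.5 * sd
    ]

# Hypothetical data: one extreme pre-program producer (300 pages)
# and one participant who wrote nothing during the program
pre = [10, 10, 10, 10, 10, 10, 10, 10, 10, 300]
during = [20, 0, 15, 18, 25, 12, 30, 22, 16, 40]
print(len(exclude_outliers(pre, during)))  # 8 participants remain
```

Note that in very small samples a single extreme value inflates the standard deviation itself, which is one reason a 2.5-SD cutoff is more workable at N near 100 than at N = 5 or 8.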

Table 1 shows the characteristics of the participants. To summarize, most were women (62%). Participants were faculty (54%), graduate students (40%), or staff (6%). Most were untenured (87%). Participants came from a variety of disciplines.

Table 1. Demographics of Publish & Flourish Participants 2011–2015

Gender: Female 62% (n = 58); Male 38% (n = 35)
Status:¹ Faculty 54% (n = 50); Graduate student 40% (n = 37); Staff 6% (n = 6)
Tenure status: Tenured 13% (n = 12); Untenured 87% (n = 81)
Field: Agriculture 4% (n = 4); Arts and humanities 16% (n = 15); Business 4% (n = 4); Education 9% (n = 8); Engineering 12% (n = 11); Health 5% (n = 5); Library 8% (n = 7); Life/Physical sciences 8% (n = 7); Social sciences 27% (n = 25); Other 8% (n = 7)

¹ During the program, graduate students wrote slightly fewer days per week than faculty and staff (4.0 vs. 4.4). Graduate students wrote or revised about the same number of pages (104 vs. 109). They were less likely to submit one or more manuscripts (51% vs. 70%), perhaps because they did not have as much background research completed before the start of the study.

Our scholars reported submitting an average of four manuscripts in the previous two years. Although submissions are not publications, there should be a strong correlation between the two. The productivity of participants, therefore, was in the moderate to high range compared to faculty across the United States. That is, about half of faculty across the United States reported publishing between one and four professional writing outputs in the past two years (Eagan et al., 2014, p. 30).

When we started the study, we asked to what extent Publish & Flourish helps scholars become better, more productive writers. We began by asking how our results compared to the studies reviewed by McGrail et al. (2006), given that we used a far larger sample and excluded outliers. Participants in our study produced 2.7 times as many submissions during the intervention as prior to it. Put another way, a typical participant submitted one manuscript in the 24 weeks prior to the intervention and a second manuscript during the 9 weeks of the intervention (i.e., in about a third of the time). Extrapolating both averages over a year, participants would have submitted about two manuscripts per year before the intervention and almost six during it. Despite the important differences between the studies, this result is consistent with the “more than double” increase reported by McGrail et al. (2006).
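
The extrapolation works out as follows; this is simply a sketch of the arithmetic described above, using 52 weeks per year:

```python
WEEKS_PER_YEAR = 52

# A typical participant: one submission in the 24 weeks before the
# program and one submission during its 9 weeks
pre_rate = 1 / 24      # submissions per week before the intervention
during_rate = 1 / 9    # submissions per week during the intervention

print(round(pre_rate * WEEKS_PER_YEAR, 1))     # 2.2, about two per year
print(round(during_rate * WEEKS_PER_YEAR, 1))  # 5.8, almost six per year
print(round(during_rate / pre_rate, 1))        # 2.7, the factor increase
```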

Next, we asked how findings compared to Boice (1989), given that Publish & Flourish was designed for more numerous and more productive writers who were held accountable with a much less intensive intervention. Predictably, Publish & Flourish did not increase scholarly productivity by as great a factor as Boice (1989, p. 609), a study in which the 10 formerly unproductive writers in the experimental group outperformed those in the control group by a factor of nine. The 93 previously productive writers in this study increased their productivity by a factor of 2.7. Also predictably, the previously productive writers in this study produced almost twice as much new or revised prose per year during the intervention as the experimental group of writers in Boice (290 vs. 157 pages).

We asked, too, how findings in this version of Publish & Flourish compared to Gray and Birch (2000), given that this version included a shared and monitored spreadsheet rather than private reporting to the program director only. Specifically, in the current version of Publish & Flourish, faculty, staff, and graduate students wrote 4.2 days a week and produced an extrapolated average of 290 pages written or revised per year. In Gray and Birch (2000, p. 275), participants wrote two days a week and produced an average of 150 pages. To summarize, Table 2 shows that participants in this later version of Publish & Flourish wrote about twice as many days per week and wrote or revised about twice as many pages as participants in the earlier intervention (Gray & Birch, 2000).

Table 2. Possible Impact of a Shared and Monitored Spreadsheet

                                     Gray and Birch (2000)   Current study
Days of writing per week                       2                  4.2
Pages written or revised per year            150¹                 290

¹ In Gray and Birch (2000, p. 275), participants would have produced 75 pages of new prose had the pages they wrote in the semester-long program been extrapolated over a year. During the intervening years, we noticed that Boice counted new and revised pages. In the current study, to make the findings more consistent with Boice (1989), pages were counted as written and revised, so a page was counted twice: once as written and once as revised. The 75 pages of new prose reported in Gray and Birch (2000, p. 275) is thus more nearly equivalent to 150 pages of new or revised prose. This means that participants who wrote two days a week in Gray and Birch (2000) produced about half as many pages as those in the current study, who wrote about four days per week.

Next, on the anonymous surveys with Likert-scaled and open-ended questions, we asked participants to share their perceptions about whether and why Publish & Flourish helped them become more prolific writers. Responses to the five Likert-scaled items were uniformly positive. Ninety-five percent (n = 88) of the participants stated that they wrote more often and for more minutes in total. In response to an open-ended question, some participants reported that they had completed a number of projects during the nine-week period. This is as one might expect from a group that wrote so regularly:

I was able to get five different manuscripts ready for submission.

I had several deadlines this semester and was able to meet them and even add a couple of grant proposals to my list.

Other participants explained why Publish & Flourish helped them become more productive.

I wrote nearly every day of the program for at least 15 minutes straight. Often, after the 15 minute mark, I had more to say and I just kept writing.

I loved how writing a little each day made me think about my writing when I WASN’T at my desk. Good ideas and phrasing seemed to hit me throughout the day.

Overall, participants seemed to become more productive because they started to write every day and because they thought about their writing throughout the day.

Finally, we asked participants whether Publish & Flourish helped them become better writers. Ninety-five percent (n = 88) agreed that it had. This might be because of feedback from readers and because of the requirement to organize paragraphs around key sentences. When participants were asked if they had received meaningful feedback from readers, 94% (n = 87) agreed. In response to open-ended items, participants noted that feedback helped them think more like readers.

Through interactions with other program participants, I developed the skill of thinking as my reviewers or readers would.

After my group read my draft, I approached editing more confidently knowing exactly which changes would make my draft more… comprehensible to readers.

The participants appreciated the encouragement to make their prose more reader-centered.

There was another reason why participants improved their writing. Ninety-five percent (n = 88) of participants reported that they became better writers because of the emphasis on organizing around key sentences.

Making sure that I had my key sentence for each paragraph really improved the clarity of my writing and helped me to organize my thoughts.

I have worked on organization by using the “key sentence” concept. Thanks to the simple idea, I think my work is more readable, compelling and well organized.

Using key sentences seemed to help writers organize their manuscripts.

In sum, participants reported that Publish & Flourish helped them write more and better prose. Perhaps they wrote more because they started writing more often and thought about their writing during the day. Perhaps they wrote better because they sought feedback from readers and organized their papers around key sentences.

Discussion

Publish & Flourish seemed to improve the quality of writing and increase scholarly productivity. Ninety-five percent (n = 88) of participants reported that Publish & Flourish improved the quality of their writing, perhaps because participants got feedback from others on drafts of their writing and organized their papers around key or topic sentences. In addition, Publish & Flourish seemed to increase scholarly productivity by a factor of 2.7. That is, participants increased their extrapolated average of manuscripts submitted per year from about two to almost six, perhaps because of the shared and monitored spreadsheet and its associated peer pressure.

The increase in productivity in Publish & Flourish was similar to that in the six studies in the meta-analysis by McGrail et al. (2006). It is remarkable that these productivity increases are similar because in McGrail et al. (2006), the writing groups with the fewest participants (N = 5 and N = 8) yielded by far the greatest productivity, with outliers driving the results. The current study shows that the same increase in productivity can be achieved with a greatly increased sample size (N = 93) and with outliers excluded. Therefore, Publish & Flourish, with its emphasis on writing daily, may be superior to other programs for educational developers who want to jumpstart productivity with many participants on a large scale.

That said, productivity increases in Publish & Flourish differed from those in Boice (1989). The writers in this study could not reasonably have increased their productivity by as big a factor (nine) as the experimental group in Boice; had they done so, they would have had to write or revise 966 pages per year (because they produced an extrapolated average of 107 pages of written or revised prose per year before the study period). Our relatively more productive writers did generate far more pages of scholarship than the experimental group of writers in Boice (1989). During the year-long study period in Boice (1989), the experimental group generated an average of 157 pages of written or revised prose (p. 609). In contrast, the writers in this study generated an extrapolated average of 290 pages, or almost twice as many. This is as would be expected, given that the writers in Boice (1989) were previously unproductive compared to the writers in this study.

Productivity increases were markedly better than those in the earlier version of Publish & Flourish (Gray & Birch, 2000), perhaps because of the peer pressure associated with a shared and monitored spreadsheet. This type of spreadsheet seemed to contribute to participants writing twice as many days and twice as many pages as they had in the earlier version of the program (Table 2).

The study had four important limitations, which future studies should address. First, we did not monitor the writing of participants after the end of the study to determine the extent to which they remained better, more prolific writers. A future study might follow participants for a year or more after the intervention.

Second, we collected preintervention data retrospectively, for the semester before the study semester (de Vaus, 2006). Ideally, participants would have been enrolled in Publish & Flourish during the semester prior to the study. However, this would have severely limited the number of participants because most participants do not register for professional development a full semester in advance.

Third, we examined submission rates rather than publication rates. A future study should explore the effect of writing interventions on publications rather than submissions. Although submissions and publications are strongly correlated, interventions may not affect the two identically. Indeed, a program like this one might yield a greater improvement in publication rates because of the feedback participants received and because 95% (n = 88) of participants reported that their writing improved.

Fourth, we lacked a control group. Without one, we cannot attribute the increase in productivity solely to Publish & Flourish. The need for a control group is shown by the work of Helen Sword, the internationally known researcher on scholarly writing. She interviewed or surveyed 1,200 scholars and found that seven out of eight do not write every day (Sword, 2016, p. 316; Sword, 2017, p. 15). In her words, “successful academics carve out time and space for writing in an impressive variety of ways” (Sword, 2016, p. 318). She urges scholars “to leave behind their hair shirts of scholarly guilt when they enter the house of writing. Productivity, it turns out, is a broad church that tolerates many creeds” (Sword, 2016, p. 321; Sword, 2017, p. 15). Sword is right that there are many good ways to write, although we would argue that writing daily is the best. Her work, however, also lacks a control group. It raises the question: would writers who write daily outperform those who write only occasionally? We think they would (see Table 2).

Daily writing has been shown to help both unproductive and productive writers get started or up their game. Perhaps asking scholarly writers to write every day is akin to asking a mixed group of active and inactive adults to walk daily. Walking has advantages over other forms of exercise: it does not assume a preexisting level of fitness, and it is an easy way to get started immediately. Some people who start walking daily are still walking years later, some switch to other forms of exercise, and some stop altogether. Still, it is good advice. And so it is for writers: writing daily is an easy way to get started immediately or to greatly increase a low or moderate level of writing. If you want large numbers of scholars to take writing to the next level, there is no better way. If big blocks of time prove elusive, try little ones.

In sum, educational developers should continue to help scholars become better, more prolific writers. Helping scholars provides a win-win situation for faculty, educational developers, institutions, and the body of knowledge, as well as the public and policy makers. Faculty members benefit by enhancing their scholarship and, thereby, their careers. Educational developers benefit because it is much easier to document that your scholarly programs tripled productivity than it is to demonstrate that your teaching programs tripled teaching effectiveness. Institutions benefit from enhanced faculty performance. The body of knowledge benefits because scholars contribute more and better scholarship, thus extending knowledge and making it more accessible to the public and to policy makers.

Acknowledgments

We thank the following scholars who provided valuable feedback on earlier drafts of this paper: Jane Birch, Jean Conway, Brian Martin, Claire Rickard, Michael Rifenburg, Paul Silvia, Dannelle Stevens, and Helen Sword. Ereney Hadjigeorgalis deserves special mention for recommending a reorganization of the paper.

References

  • Bean, J. C. (2011). Engaging ideas: The professor’s guide to integrating writing, critical thinking, and active learning in the classroom. San Francisco, CA: Jossey-Bass.
  • Belcher, W. L. (2009). Writing your journal article in 12 weeks: A guide to academic publishing success. Los Angeles, CA: Sage.
  • Bellas, M. L., & Toutkoushian, R. K. (1999). Faculty time allocations and research productivity: Gender, race and family effects. Review of Higher Education, 22, 367–390.
  • Billig, M. (2013). Learn to write badly: How to succeed in the social sciences. Cambridge, England: Cambridge University Press.
  • Boice, R. (1984). Why academicians don’t write. Journal of Higher Education, 55, 567–582.
  • Boice, R. (1989). Procrastination, busyness and bingeing. Behaviour Research and Therapy, 27, 605–611.
  • Booth, W. C., Colomb, G. G., Williams, J. M., Bizup, J., & FitzGerald, W. T. (2016). The craft of research (4th ed.). Chicago, IL: The University of Chicago Press.
  • de Vaus, D. (2006). Retrospective study. In V. Jupp (Ed.), Sage dictionary of social research methods (pp. 268–270). New York, NY: Sage.
  • Eagan, K., Stolzenberg, E. B., Lozano, J. B., Aragon, M. C., Suchard, M. R., & Hurtado, S. (2014). Undergraduate teaching faculty: The 2013–2014 HERI Faculty Survey. Los Angeles, CA: UCLA Higher Education Research Institute.
  • Fairweather, J. S. (2005). Beyond the rhetoric: Trends in the relative value of teaching and research in faculty salaries. Journal of Higher Education, 76, 401–422.
  • Goodman, S. N., Berlin, J., Fletcher, S., & Fletcher, R. (1994). Manuscript quality before and after peer review and editing at Annals of Internal Medicine. Annals of Internal Medicine, 121(1), 11–21.
  • Gray, T. (2015). Publish & Flourish: Become a prolific scholar (2nd ed.). Las Cruces, NM: New Mexico State University Teaching Academy.
  • Gray, T., & Birch, A. J. (2000). Publish, don’t perish: A program to help scholars flourish. In D. Lieberman, & C. Wehlburg (Eds.), To improve the academy (Vol. 19, pp. 268–284). San Francisco, CA: Jossey–Bass.
  • Gray, T., Birch, A. J., & Madson, L. (2013). How teaching centers can support faculty as writers. In A. E. Geller, & M. Eodice (Eds.), Working with faculty writers (pp. 95–110). Logan, UT: Utah State University Press.
  • Humphrey, R., & Simpson, B. (2012). Writes of passage: Writing up qualitative data as a threshold concept in doctoral research. Teaching in Higher Education, 17, 735–746.
  • Jensen, J. (2017). Write no matter what: Advice for academics. Chicago, IL: University of Chicago Press.
  • Limerick, P. N. (1998). Dancing with professors. In V. Zamel, & R. Spack (Eds.), Negotiating academic literacies: Teaching and learning across languages and cultures (pp. 199–206). London, England: Routledge.
  • McGrail, M. R., Rickard, C. M., & Jones, R. (2006). Publish or perish: A systematic review of interventions to increase academic publication rates. Higher Education Research and Development, 25(1), 19–35.
  • McVeigh, C., Moyle, K., Forrester, K., Chaboyer, W., Patterson, E., & St John, W. (2002). Publication syndicates: In support of nursing scholarship. Journal of Continuing Education in Nursing, 33(2), 63–66.
  • Mills, C. W. (1959). On intellectual craftsmanship. In C. W. Mills (Ed.), The sociological imagination (pp. 195–226). New York, NY: Grove.
  • Page-Adams, D., Cheng, L. C., Gogineni, A., & Shen, C. Y. (1995). Establishing a group to encourage writing for publication among doctoral students. Journal of Social Work Education, 31, 402–407.
  • Rice, R. E., Sorcinelli, M. D., & Austin, A. E. (2000). Heeding new voices: Academic careers for a new generation. Washington, DC: American Association for Higher Education.
  • Schuster, J. H., & Finkelstein, M. J. (2006). The American faculty: The restructuring of academic work and careers. Baltimore, MD: Johns Hopkins.
  • Sword, H. (2012). Stylish academic writing. Cambridge, MA: Harvard University Press.
  • Sword, H. (2016). ‘Write every day!’: A mantra dismantled. International Journal for Academic Development, 21(4), 312–322.
  • Sword, H. (2017). Air & light & time & space: How successful academics write. Cambridge, MA: Harvard University Press.
  • Williams, J. M., & Bizup, J. (2016). Style: Lessons in clarity and grace (12th ed.). Chicago, IL: University of Chicago Press.
  • Yun, J. H., & Sorcinelli, M. D. (2009). When mentoring is the medium: Lessons learned from a faculty development initiative. In L. B. Nilson, & J. E. Miller (Eds.), To improve the academy, Vol. 27. Resources for faculty, instructional, and organizational development (pp. 365–384). San Francisco, CA: Jossey–Bass.