As educational developers working with multiple constituencies and demands on our time, how can we efficiently and creatively improve our programming and prioritize our efforts? In this chapter, we offer a simple heuristic to prompt quick yet generative examination of our goals or programs in relation to three key characteristics of effective educational development on three different institutional levels. We then describe uses and applications of the tool and reflective process, which allow developers to efficiently gain insight into their work and effectively frame priorities for planning and improvement.

Higher education is in a state of unusual flux, confronting a rapidly changing environment with increasingly limited resources. Christensen and Eyring (2011) contend that this is a moment of "disruptive innovation" that will lead to sudden, fundamental change similar to that experienced recently by newspapers, bookstores, and record companies. This turbulence presents a distinct challenge to our profession. More than ever before, we need to determine frequently, quickly, and creatively what activities and programming we should prioritize (Sorcinelli, Austin, Eddy, & Beach, 2006). We must consider not only how to do things better but when and how to do better things, including identifying emerging opportunities and discovering our gaps and blind spots. While these pressures can feel threatening and ominous, they also present distinct opportunities for us as individuals and as a field.

As educational developers, we are familiar with managing change. We routinely advocate for the kind of scholarly, reflective approaches that are essential in the face of disruptions. And we champion evidence informed innovation at our institutions, as our field has become more scholarly by developing increasingly sophisticated approaches for evaluating programs and measuring impact (Chism, 1998; Chism, Holley, & Harris, 2012; Debowski, 2011a; Stefani, 2010). Such assessment efforts help us adapt to changing conditions while also documenting the impact of our work for ourselves and for broader audiences.

Yet rapidly evolving environments and institutions require us to be ever more nimble and to make adjustments even before undertaking full-scale program reviews or annual assessments. Short of traditional, time-intensive planning activities, how can we reflect on the work we do in our institutions in efficient and generative ways? Research suggests that developers "would welcome more opportunities for scholarly reflection on practice" (Sorcinelli et al., 2006, p. 166), but what tools do we have to help guide that work? Moreover, how do we make time for the kind of considered reflection characteristic of deep learning and necessary for understanding the complicated and often messy ideas and problems we encounter (Moon, 1999), or for the kind of structured analysis that will help us align the theories and values we espouse with those we actually employ (Schön, 1983, 1987)?

A Tool for Creative and Critical Reflection

Our framework emerged out of conversations we had in preparation for a 2011 AAC&U conference presentation on the characteristics of effective educational development programs in the changing landscape of higher education (Felten, Little, Ortquist-Ahrens, & Reder, 2011). We brought to those conversations our own professional experiences in diverse institutional contexts, as well as perspectives from the literature. Because that conference draws more deans and other senior administrators than it does developers, we could not assume our audience had an intimate knowledge of educational development practices or frameworks. We needed to highlight the most salient aspects of our field in ways that would explain our work to a diverse group of professionals from a wide variety of institutions.

As we refined and applied that draft framework on the heels of the conference, we found it to be surprisingly useful as a heuristic: a tool that supports an exploratory approach to problem solving that may lead to discovery, one that involves trial and error and creative thinking as much as critical analysis. Since that initial AAC&U session, we have used this framework on our own campuses and tested it in the 2011 POD conference anchor session (Ortquist-Ahrens, Felten, Foster, Little, & Reder, 2011) and again in a session at the 2012 conference (Felten, Little, Ortquist-Ahrens, & Reder, 2012).

This heuristic is not a comprehensive framework for assessment but rather a tool to prompt guided reflection and discussion; it can serve as a point of departure for noticing patterns, locating gaps, and strategically envisioning future possibilities. The heuristic encourages divergent thinking that can facilitate a shift in perspective, helping us see the familiar in new ways or hinting at different possibilities for routine practices (Cropley, 2006). As Tosey (2006) argues, the intense and continuous change we face requires us to be more creative and agile than ever before, even if we operate in a system that can inhibit novelty and tend toward conservatism. This heuristic also provides an effective task constraint (Cropley, 2006; Tosey, 2006), an open-ended problem or challenge with rules to help sort through possibilities, creating an interplay of convergent thinking, which Cropley asserts "always generates orthodoxy," with divergent thinking, which "always generates variability" (2006, p. 392).

The Structure of the Heuristic

Vertically, our grid features three characteristics of effective educational development programs: people focused, context sensitive, and evidence informed. Horizontally, the grid contains three perspectival levels on which our programming, influence, or leadership can operate and that need to be considered when designing programs or prioritizing our involvement on any campus: the individual, the departmental, and the institutional (Table 11.1).

Characteristics

Each of the three major characteristics highlighted along the left side of the heuristic plays a role in the emerging agenda that Sorcinelli et al. (2006) sketch for the future of educational development in "the Age of the Network" (p. 157). Even as we broaden the scope of traditional faculty development, we should remain fundamentally focused on the people involved, though perhaps in new ways and with an eye toward redefining diversity. Furthermore, to be most effective, we will need to be responsive to local context as well as grounded in scholarship (Sorcinelli et al., 2006).

PEOPLE FOCUSED Our work first involves the individuals and groups of stakeholders that it affects directly and indirectly: from faculty and graduate students at the individual level, to departmental life, undergraduate students, senior administrators, and other stakeholders at the institutional level. For each, we must take into consideration people's needs and their potential for growth. Filling out this cell in the heuristic means answering such questions as: Which individuals or groups are involved in or served by the program? What are their characteristics, needs, or contributions? Asking questions like these keeps institutional perspectives on students and their learning at the forefront, even when designing programs for individual faculty. It also means thinking developmentally about faculty needs at different career stages and then providing appropriate programming and support (Debowski, 2011a). Focusing on people highlights how our work serves as a bridge between the sometimes conflicting interests of different stakeholders, for example, between faculty and administrators (Little & Green, 2012), and means that we are often called on to help facilitate organizational change (Chism, 1998; Latta, 2009). Even as we work on a microlevel, we must take a larger view of the complex interests and needs that both inform and drive our work. The heuristic also emphasizes that a developer's role often operates on different levels, from professional or instructional development (individual) to midlevel unit development (departmental, curricular) to broader organizational development (institutional) (Debowski, 2011a, 2011b; Schroeder & Associates, 2011).

Table 11.1 The Heuristic

The grid's three columns represent the perspectival levels:

  • Individual Perspective: focusing primarily on the individual needs of the people we work with one-on-one
  • Departmental Perspective: concentrating on the needs and issues of groups within an institution, such as departments or programs
  • Institutional Perspective: considering systemic issues or trends from a broader perspective, including the perspectives of multiple stakeholders at or beyond the institution

Its three rows represent the characteristics of effective programs:

  • People focused: planning with the needs of stakeholders in mind, including faculty, staff, and administrators, as well as students and others
  • Context sensitive: taking into consideration institutional type, mission, size, specific student body, history and traditions, challenges, and goals
  • Evidence informed: drawing on the scholarship of higher education and of faculty and student success, and using evidence from data collected locally about teaching, learning, and programming

The nine cells formed by crossing the rows and columns are left open for users to fill in.

CONTEXT SENSITIVE Effective educational development programs fit (and help shape) the culture, mission, opportunities, and constraints at each level of the institution. For this reason, programs or approaches cannot simply be transplanted from one college or university to another without careful modulation (what Carew, Lefoe, Bell, & Armour, 2008, describe as "Elastic Practice"; Sorcinelli et al., 2006). Programs and services need to be carefully tailored to an institution's distinctive priorities, culture, values, and resources (Latta, 2009; Milloy & Brooke, 2004) and to the individuals within that system. This row of cells on the heuristic opens up such questions as: What contexts do the individuals consulting with us come from, work within, need, or need to adjust to in order to thrive? What efforts would help the institution thrive as well?

Being context sensitive means that the work we do is shaped by the setting in which we are working and the circumstances of audience, politics, timing, or external pressures that inform it. Our work needs to take into consideration the environment, relationships, and power dynamics at each level in which we operate.

EVIDENCE INFORMED Our work must be informed by evidence about effective faculty development, as well as research on teaching and learning, including institution-specific information about student learning and experiences. This part of the heuristic raises questions such as: What literature or models are we drawing on to design and deliver our programs? How do we know they are effective, and for whom? To succeed "in claiming and deserving the right to respect and credibility," we must, as Bath and Smith (2004) argue, "make explicit the research underlying both the theories of [our] discipline and [our] pragmatic engagement with the day-to-day teaching problems" we help others resolve (p. 24). Specifically, being evidence informed means drawing and building on a base of research and scholarship about student learning (Ambrose, Bridges, DiPietro, Lovett, & Norman, 2010; Brownell & Swaner, 2010; Hutchings, Taylor Huber, & Ciccone, 2011; Kuh, 2008) as well as research on effective educational development practices (Chism et al., 2012; Kreber & Brook, 2001). If we are to meaningfully advocate for evidence-informed teaching and curricular approaches, we too must approach our work in an intentionally scholarly manner, inquiring systematically into its effectiveness, collecting our own evidence to learn how well our efforts are working, and then using that evidence to improve (Chism & Szabo, 1997; Debowski, 2011b; Felten, Kalish, Pingree, & Plank, 2007; Hines, 2009, 2011; Stefani, 2010). Being informed by evidence means taking a scholarly approach to our work by using our knowledge of relevant research and contributing to it through ongoing assessment of our programs and services.

Perspectives

Again following the lead of Sorcinelli et al. (2006), we frame educational development as having the potential to foster both individual and institutional change. Many campus programs focus on one-on-one or small group work with faculty, driven by the needs of participants and operating within a culture of confidentiality. This attention to the individual faculty member or to the needs of specific groups makes educational development "safe" and ground-up, two essential values for many developers; it also makes our work sustainable in places where educational development resources, both people and funding, are scarce. However, this ethos can create barriers to broad-scale organizational development work, particularly when developers are perceived to be on the margins of campus, far from (and perhaps deliberately avoiding) places of influence and power (Chism, 1998, 2011; Schroeder & Associates, 2011). We recognize that developers need to consider multiple perspectival levels simultaneously, and we represent these in three columns: individual, departmental, and institutional. Working through the columns of the heuristic raises questions about who else would benefit directly or indirectly from a program (and at what level), and it leads us to wonder who else we could involve or inform about it.

INDIVIDUAL Faculty development work has traditionally targeted the growth of individual faculty members. To do this effectively, we acknowledge our colleagues' diversity related to individual identity and power (race/ethnicity, gender, sexuality, age, experience, ability, tenured/untenured) as well as personality type and disposition. In addition, we consider the different disciplinary backgrounds, expectations, and preferences that characterize the particular contexts in which they teach. Our work must recognize that effective teaching can take various shapes, and it must honor these differences rather than advocate for any one right way to teach (Reder, 2007, p. 11).

DEPARTMENTAL While supporting individuals may remain central to our work, the academic department is typically a particularly relevant context for individual faculty members, whose identity often is deeply embedded in a disciplinary context and whose first loyalty is to a department. Wergin (2003) argues that engaged departments are one of the six key elements of a quality institutional climate and that the key to creating a successful department is engaging diverse constituencies in particular ways. As educational developers, we can play a role in improving the ways departments support individual faculty and design their curricula, majors, and courses, helping them make evidence informed decisions that improve student experiences.

While most colleges and universities have departments, we recognize that at many institutions, there are other significant organizational units that may warrant inclusion in this portion of the heuristic, such as major programs, divisions, or colleges. Although on many campuses the department is primary for individual faculty members and mediates their relationship to the institution as a whole, this is not always the case; thus, this category should be modified as appropriate.

INSTITUTIONAL Effective educational development efforts also reach beyond individual needs to the overall health and effectiveness of the organization, remaining cognizant of priorities for student learning and the academic program. Sorcinelli et al. (2006) note that educational development work often is about making connections. In serving the needs of a changing professoriate and an increasingly diverse student body, teaching centers are more and more engaged in brokering collaborations among various constituencies across campus (departments, student and residential life, academic technologies, writing and quantitative programs, student affairs, institutional research, and libraries) in support of student learning. The potential for an educational development program to serve as a change agent within an institution should not be underestimated, especially at a time when the very nature of teaching and learning in higher education is being examined and discussed as never before (Arum & Roksa, 2011; Bass, 2012; Debowski, 2011a; Latta, 2009).

Reflective Planning for Prioritizing and Improvement

We offer this heuristic to colleagues as a new tool. We have consciously aimed to keep it simple so that it can support short bursts of creative reflection, formative assessment, dialogue, and planning. In describing a sophisticated strategic planning model for educational development, Hunt (2006) asks a question that evokes a familiar dilemma: "What do you do when it seems that everything should be done at once?" (p. 75). Hunt (2006), Sorcinelli et al. (2006), Schroeder and Associates (2011), and others offer detailed planning frameworks to guide developers as we negotiate often competing requests that come from the top down and the bottom up at our institutions. Similarly, other authors offer a range of analytical tools for and approaches to summative program assessment (Hines, 2009; Plank & Kalish, 2010; Stefani, 2010; Walvoord, 2004) as they describe in-depth evaluation metrics for programming and services. Our heuristic is not intended to replace these comprehensive models but to complement them, providing an efficient perspective-taking exercise that can capture how educational development units are engaging stakeholders at different levels and how well our programming matches our goals, constituent needs, and institutional priorities.

Successful educational development programs must be agile in responding to change. As a holistic tool supporting creative reflection on experience and practice, our heuristic functions as an instrument that can be used with a relatively low investment of time and other scarce resources. Moreover, it is designed to be flexible, so that developers in a range of situations can use or modify it to reflect their own contexts. We intend the heuristic as a guide to reflection that allows us to quickly assess some aspect of our programming or services. Times like these, Claxton (2006) contends, "do not succumb to methodological problem-solving using familiar constructs," but rather require the "soft creativity" necessary to navigate multifaceted, "ill-defined and open-ended" situations (p. 357). As a heuristic rather than a script, this framework supports brainstorming and thinking beyond the familiar.

Suggestions for Using the Heuristic

When we experimented with the heuristic ourselves, we each followed a simple process. We allotted ourselves ten to fifteen minutes to identify a single program as a focus and then consider that program through the perspective of each cell in the grid. We discovered that some cells were more easily and quickly populated than others, particularly those in the "Individual Perspective" column; determining what we might include in others sometimes felt puzzling. We then discussed our thoughts and findings with each other and clarified in that exchange any new insights that emerged. For example, asking what a particular program had to do with the departmental perspective led us to ask new questions that we might not have fully considered before: How does our work with individuals or cohorts relate to departments, or, conversely, what impacts do departmental cultures have on our work? Should we plan programming at the department level or in concert with departments rather than working more exclusively with individuals or cohorts? Such previously unarticulated questions often prompted us to come to surprising and generative insights, and the entire process took only about thirty minutes from start to finish.

When we offered the heuristic as an invitation for guided reflection and dialogue at the 2011 and 2012 POD conferences, we urged colleagues to take an attitude of curiosity toward their work. We varied the initial prompts for each conference session. In both, we asked participants to begin with a particular point of departure: one to two specific programs they wished to assess (POD 2011) or a central goal, value, or mission for their work (POD 2012). Then we asked them to spend fifteen minutes filling out the heuristic individually with this program or goal in mind, using these prompts:

  • Which individuals or groups are involved in or served by the program? What are their characteristics? Their needs? Their contributions?

  • What contexts do they come from? Work within? Need to thrive in?

  • Who else would benefit directly or indirectly from the program? Individuals? Departments or units? The larger institution?

  • Who else might be informed about or involved in programming?

  • What literature or models are you drawing on in designing and delivering your program?

  • How do you know about your program’s effectiveness?

As it had for us, this stage of the process proved vexing but productive for participants. The heuristic does not provide a simple grid to fill out with numbers of events or program goals; instead, it invites the user to determine the salient features of each category, how the categories relate, and which theories, implicit or explicit, might be shaping one’s practice. Using the heuristic is less about applying known data than articulating key questions and discovering possibilities.

Next, participants reflected, individually and in pairs, on what they discovered from using the heuristic, by considering the following questions: What do you notice? Where are the gaps or opportunities? What are the implications of what you notice for that program or your center more broadly? During this step, participants considered and discussed how the heuristic helped them understand aspects of their program, whether there were important things not represented in the grid, and what this all meant for their decisions about prioritizing and determining where to focus energies or resources. For the final step in the process, participants committed to concrete actions in response to this reflection cycle, naming what needed to be done, setting deadlines, and specifying the first action they would take toward that goal.

Applying and Adapting the Heuristic

We intend our heuristic to be neither prescriptive (not all cells must be filled in) nor definitive. It is meant as a starting point for others to adapt for their own contexts as appropriate. For example, when we each used the heuristic to examine new faculty programming on our various campuses, we found that although we shared similar goals, some program design features, and similar areas to address, we differed in how we planned to act on those insights within our individual institutional contexts. When applying the heuristic to new faculty programming, each of us discovered that completing the individual and institutional columns of the heuristic was very easy, but the departmental perspective presented a challenge.

For all of us, the heuristic revealed that our new faculty workshop or seminar was doing little to address the stakeholders within the departments themselves. So we asked ourselves: Are there new ways to engage departments in new faculty programming to highlight complementary departmental efforts or initiatives? Are we taking advantage of opportunities to communicate with departments and other important constituencies about our work with new faculty? As individuals, we came up with different answers to these questions. One of us decided to improve communication with departments about the seminar on a broader scale through a quick announcement at a meeting of department chairs and a follow-up e-mail with more information. Another of us decided to cultivate relationships with a few key department chairs on an individual basis to learn how our programming was anticipating and responding to the needs of their new faculty. In these cases, the heuristic helped make sense of granular data about our programs by looking at each program in a larger context. That "view from ten thousand feet" yielded insights and suggested improvements in communication that will increase the program's effectiveness and save us time in the long run, but it also took different forms appropriate to our individual work at the local level.

Many participants in both the 2011 and 2012 conference sessions offered further adaptations and uses: diagnosing the relative contribution of multiple programs to the mission of a center; assessing programs while also analyzing the multiple relationships a center director develops in that role; and revisiting the same program or goal at different moments, thus adding a valuable temporal element to the insights gained. Still others found that layering different programs onto the grid allowed them to reflect on and assess center-wide programming in order to discover stakeholder needs that were not being met or places where the degree of focus on a particular group or need did not match the importance afforded it in broader institutional strategic priorities. Colleagues from Fairfield University left the 2011 session energized by insights gained through the exercise; when they realized that the in-depth analysis they wanted necessitated a more comprehensive tool, they used the heuristic as a gateway instrument and built on it with additional work in program evaluation and assessment (Klaf, Miners, & Nantz, 2012). These examples do not constitute an exhaustive inventory of ways to adapt the heuristic, but they hint at the possibilities for how it might be used.

Beyond being time efficient, the heuristic is a fruitful means for stimulating reflection and structuring conversations among colleagues about program planning and assessment from a wide-angle perspective. Based on insights generated through reflection, developers can choose not only which programs to redesign but also which initiatives and projects have the greatest likelihood of positive outcomes for multiple stakeholders and at multiple levels.

Conclusion

Our experience, coupled with that of our colleagues at various institutions who have now used the heuristic, suggests that the small investment of time using this simple tool for guided reflection can lead to valuable new perspectives and reframing of our work. Even thirty minutes of concentrated periodic reflection with the heuristic can uncover insights that will inform our practices and priorities, helping us to see where we might make adjustments to improve communication with multiple stakeholders; to deepen, broaden, or improve offerings; to work to effect systemic change or participate in departmental or institutional initiatives; or simply to avoid offering a scattered and unsystematic menu of options.

No single tool is appropriate in all settings and for all tasks, of course, and even with this tool, the reasoning and ultimate decisions of individual developers may differ significantly based on which specific values or groups of people hold priority within a particular institutional context. We have found the heuristic can help us clarify our priorities as well as articulate how different programs fit within the broader framework of our educational development work. The heuristic's structure captures essential aspects of our field and might be used as a foundation for those seeking to undertake extensive assessment and strategic planning exercises, and also as a theoretical basis for research on educational development.

While the heuristic is no substitute for the systematic evaluation of an external review or a major assessment of our programs, it provides a productive tool for responding to the dynamic forces confronting our field today. Being an agile and effective educational developer requires both deeply immersive planning and sustainable, brief bursts of generative thinking. Efficient tools like this heuristic will help all of us to imagine possibilities and take new perspectives on our work, allowing us to contribute to, and perhaps even to lead, positive change at our institutions and in higher education.

References

  • Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How learning works: Seven research-based principles for smart teaching. San Francisco, CA: Jossey-Bass.
  • Arum, R., & Roksa, J. (2011). Academically adrift: Limited learning on college campuses. Chicago, IL: University of Chicago Press.
  • Bass, R. (2012). Disrupting ourselves: The problem of learning in higher education. Educause Review, 47(2), 23-33.
  • Bath, D., & Smith, C. (2004). Academic developers: An academic tribe claiming their territory in higher education. International Journal for Academic Development, 9(1), 9-27.
  • Brownell, J. E., & Swaner, L. E. (2010). Five high-impact practices: Research on learning outcomes, completion, and quality. Washington, DC: Association of American Colleges and Universities.
  • Carew, A. L., Lefoe, G., Bell, M., & Armour, L. (2008). Elastic practice in academic developers. International Journal for Academic Development, 13(1), 51-66. doi:10.1080/13601440701860250
  • Chism, N.V.N. (1998). The role of educational developers in institutional change: From basement office to front office. In M. Kaplan (Ed.), To improve the academy: Resources for faculty, instructional, and organizational development, Vol. 17 (pp. 141-154). Stillwater, OK: New Forums Press and the Professional and Organizational Development Network in Higher Education.
  • Chism, N.V.N. (2011). Getting to the table: Planning and developing institutional initiatives. In C. M. Schroeder & Associates, Coming in from the margins: Faculty development's emerging organizational development role in institutional change (pp. 47-59). Sterling, VA: Stylus.
  • Chism, N.V.N., Holley, M., & Harris, C. J. (2012). Researching the impact of educational development: Basis for informed practice. In J. Groccia & L. Cruz (Eds.), To improve the academy: Resources for faculty, instructional, and organizational development, Vol. 31 (pp. 385-400). San Francisco, CA: Jossey-Bass/Anker.
  • Chism, N.V.N., & Szabo, B. (1997). How faculty development programs evaluate their services. Journal of Staff, Program, and Organizational Development, 15(2), 55-62.
  • Christensen, C. M., & Eyring, H.J. (2011). The innovative university: Changing the DNA of higher education from the inside out. San Francisco, CA: Jossey-Bass.
  • Claxton, G. (2006). Thinking at the edge: Developing soft creativity. Cambridge Journal of Education, 36(3), 351-362. doi:10.1080/03057640600865876
  • Cropley, A. (2006). In praise of convergent thinking. Creativity Research Journal, 18(3), 391-404. doi:10.1207/s15326934crj1803_13
  • Debowski, S. (2011a). Emergent shifts in faculty development: A reflective review. In J. Miller & J. Groccia (Eds.), To improve the academy: Resources for faculty, instructional, and organizational development, Vol. 30 (pp. 306-322). San Francisco, CA: Jossey-Bass/Anker.
  • Debowski, S. (2011b). Locating academic development: The first step in evaluation. In L. Stefani (Ed.), Evaluating the effectiveness of academic development (pp. 17-30). New York, NY: Routledge.
  • Felten, P., Kalish, A., Pingree, A., & Plank, K. (2007). Toward a scholarship of teaching and learning in educational development. In D. Robertson & L. Nilson (Eds.), To improve the academy: Resources for faculty, instructional, and organizational development, Vol. 25 (pp. 93-108). Bolton, MA: Anker.
  • Felten, P., Little, D., Ortquist-Ahrens, L., & Reder, M. (2011). Linking faculty development with global learning and student success. Paper presented at the annual meeting of the Association of American Colleges and Universities, San Francisco, CA.
  • Felten, P., Little, D., Ortquist-Ahrens, L., & Reder, M. (2012). Prioritizing your center's time and resources to meet 21st century demands. Paper presented at the meeting of the Professional and Organizational Development Network in Higher Education, Seattle, WA.
  • Hines, S. R. (2009). Investigating faculty development assessment practices: What’s being done and how can it be improved? Journal of Faculty Development, 23(3), 5-19.
  • Hines, S. R. (2011). How mature teaching and learning centers evaluate their services. In J. Miller & J. Groccia (Eds.), To improve the academy: Resources for faculty, instructional, and organizational development, Vol. 30 (pp. 277-289). San Francisco, CA: Jossey-Bass/Anker.
  • Hunt, L. (2006). A community development model of change: The role of teaching and learning centres. In L. Hunt, A. Bromage, & B. Tomkinson (Eds.), The realities of change in higher education: Interventions to promote learning and teaching (pp. 64-77). New York, NY: Routledge.
  • Hutchings, P., Taylor Huber, M., & Ciccone, A. (2011). The scholarship of teaching and learning reconsidered: Institutional integration and impact. San Francisco, CA: Jossey-Bass.
  • Klaf, S., Miners, L., & Nantz, K. (2012). Using POD's heuristic: Finding the big picture in the pixels. Paper presented at the meeting of the Professional and Organizational Development Network in Higher Education, Seattle, WA.
  • Kreber, C., & Brook, P. (2001). Impact evaluation of educational development programmes. International Journal for Academic Development, 6(2), 96-108.
  • Kuh, G. D. (2008). High-impact educational practices: What they are, who has access to them, and why they matter. Washington, DC: Association of American Colleges and Universities.
  • Latta, G. F. (2009). Maturation of organizational development in higher education: Using cultural analysis to facilitate change. In L. B. Nilson & J. E. Miller (Eds.), To improve the academy: Resources for faculty, instructional, and organizational development, Vol. 28 (pp. 32-71). San Francisco, CA: Jossey-Bass/Anker.
  • Little, D., & Green, D. A. (2012). Betwixt and between: Academic developers in the margins. International Journal for Academic Development, 17, 203-215.
  • Milloy, P. M., & Brooke, C. (2004). Beyond bean counting: Making faculty development needs assessment more meaningful. In C. Wehlburg & S. Chadwick-Blossey (Eds.), To improve the academy: Resources for faculty, instructional, and organizational development, Vol. 22 (pp. 71-92). Bolton, MA: Anker.
  • Moon, J. (1999). Reflection in learning and professional development. London: Kogan Page.
  • Ortquist-Ahrens, L., Felten, P., Foster, L., Little, D., & Reder, M. (2011). Conceptualizing our work: Characteristics of effective teaching and learning programs. Paper presented at the meeting of the Professional and Organizational Development Network in Higher Education, Atlanta, GA.
  • Plank, K. M., & Kalish, A. (2010). Program assessment for faculty development. In K. Gillespie, D. L. Robertson, & Associates (Eds.), A guide to faculty development (2nd ed., pp. 135-149). San Francisco, CA: Jossey-Bass.
  • Reder, M. (2007). Does your college really support teaching and learning? Peer Review, 9(4), 9-13.
  • Schroeder, C. M., & Associates. (2011). Coming in from the margins: Faculty development's emerging organizational development role in institutional change. Sterling, VA: Stylus.
  • Schön, D. A. (1983). The reflective practitioner: How professionals think in action. New York, NY: Basic Books.
  • Schön, D. A. (1987). Educating the reflective practitioner. San Francisco, CA: Jossey-Bass.
  • Sorcinelli, M. D., Austin, A. E., Eddy, P. L., & Beach, A. L. (2006). Creating the future of faculty development: Learning from the past, understanding the present. Bolton, MA: Anker.
  • Stefani, L. (Ed.). (2010). Evaluating the effectiveness of academic development. New York, NY: Routledge.
  • Tosey, P. (2006). Interfering with the interference: An emergent perspective on creativity in higher education. In N. Jackson, M. Oliver, M. Shaw, & J. Wisdom (Eds.), Developing creativity in higher education: An imaginative curriculum (pp. 29-42). New York, NY: Routledge.
  • Walvoord, B. E. (2004). Assessment clear and simple. San Francisco, CA: Jossey-Bass.
  • Wergin, J. F. (2003). Departments that work: Building and sustaining cultures of excellence in academic programs. Bolton, MA: Anker.

The authors are listed in alphabetical order, and all contributed equally to the ideas and writing of this chapter. We ask that any citation of this chapter list all four authors.