4 Evaluating the Return on Investment of Faculty Development
How can the return on investment of faculty development be determined? One way is through the application of a highly replicated and widely reported return on investment (ROI) process. This chapter briefly reviews an ROI process used by organizations throughout the world, a process that has been the basis for over 100 published studies and is the most validated and reported process for determining the monetary impact of learning. The process utilizes a five-level framework and a step-by-step ROI process model. These components are reviewed in this chapter, and an example of return on investment based on student retention in a Freshman Seminar Program is presented.
BACKGROUND ON THE RETURN ON INVESTMENT PROCESS
Return on investment (ROI) evaluation has been conducted by hundreds of organizations to meet the demands of a variety of influential stakeholders. Training departments, consulting teams, executive leaders, and workshop facilitators have been striving to prove the value of their work for several decades. Out of that work, one ROI process has risen to the top as the most commonly used and replicated process for evaluating the return on investment of learning and development: the process attributed to the work of Jack Phillips (1997a, 1997b).
The Jack Phillips ROI process is based on nearly 25 years of development. It has many satisfied users and was designed to meet the demands of many audiences (e.g., learning coaches, professional developers, consultants, trainers, and educators).
Consultants who have implemented the process report satisfaction with the process and claim it is methodical, systematic, easy to understand, and user-friendly. In addition, executives, managers, and professional evaluators give the process very high marks (Phillips, 2000).
The American Society for Training and Development acquired the Jack Phillips ROI network because the ROI process is the most commonly used procedure for holding training participants accountable and for justifying the costs of development programs (Baron, 2002). All of this points to the strength of the process and to a track record of success in meeting the needs of professional developers striving to determine the ROI of development activities.
RELEVANCE TO FACULTY DEVELOPMENT
Although the Jack Phillips ROI process is widely accepted in many fields, it has not been frequently applied to faculty development. Perhaps the need in faculty development has been small, or perhaps the field of faculty development differs greatly from other professional development fields. Regardless of the reason for its limited use, the process seems applicable to faculty development. If the process works for training and development units that serve clients primarily through consulting and workshops, why not for faculty development units?
The types of data collected in the ROI process are the types of data that faculty development units need to understand the impact of faculty development activities. Three of the six data types relate to individual changes that can occur within faculty (reactions, learning, and behavior change). The other three data types relate to results that are institutional or important to the bottom line in higher education (institutional results, return on investment, and intangible results). The following is a review of each type of data collected with each step in the ROI process as it relates to faculty development.
REVIEW OF THE PROCESS
Overview
The ROI process, like many other evaluation processes, involves planning, data collection, data analysis, and reporting. Yet, unlike other evaluation processes, the ROI process is somewhat unique because each part of the process attempts to isolate effects and convert data to monetary values so that benefits of faculty development can be compared to costs. Further, the steps, techniques, assumptions, and calculations in the ROI process follow a conservative approach to build the credibility needed for acceptance of the process. Ultimately, the goal of the ROI process is to obtain data that can be used to calculate ROI.
The formula for the ROI calculation is a simple fraction and easy to calculate. However, collecting the data to put into the calculation can be challenging and must be credible. Figure 4.1 illustrates a simple ROI calculation.
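Figure 4.1 is not reproduced here, but the calculation it illustrates takes the standard form used throughout the ROI literature, in which net program benefits are divided by program costs:

$$
\mathrm{ROI}\ (\%) \;=\; \frac{\text{Program benefits} - \text{Program costs}}{\text{Program costs}} \times 100
$$

This is the same arithmetic applied in the case study later in the chapter, where benefits of $451,000 and costs of $229,205 yield an ROI of roughly 97%.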
Levels of Evaluation: A Five-Level Framework
The ROI process utilizes a five-level framework and a step-by-step ROI process model. The five levels of the framework are shown in Table 4.1. Each level represents a type of data collected through application of the ROI process, so five types of data correspond directly to the levels of evaluation. A sixth type of data, data that is not converted to monetary values, is labeled intangible.
Level of Evaluation | Question Answered |
---|---|
Level 1: Reaction | How do participants of faculty development activities react to the faculty development activities? |
Level 2: Learning | What do the participants learn from the faculty development activities? |
Level 3: Behavior change or application | What specifically do participants of faculty development activities do differently on-the-job and after the faculty development activity? |
Level 4: Results (overall institutional results) | How does the entire institution benefit from the improvements individuals made because of the faculty development activities? |
Level 5: Return on investment | How do the benefits of the faculty development activity compare to the costs? |
The first four levels in the five-level framework were originally conceived by Donald Kirkpatrick (1996) as a model for evaluating training programs. Since their origination, the four levels have been expanded upon by Phillips’s work on the ROI process.
Planning
The first step in the ROI process is planning. Purposes of the faculty development activity are explored in this step. If a faculty development activity has an objective related to a bottom-line result for a higher education institution (e.g., student enrollment, graduation rates, alumni giving, increased funding through grants, faculty retention), then the activity is suitable for the ROI process. Thus, the purposes for the faculty development activity are matched up to the purposes for the evaluation. In addition, the timing of data collection procedures is determined for each level of evaluation and instruments and methods are selected.
Another important part of the planning phase of the ROI process includes the collection of baseline data. If data already exist that may be affected by the faculty development activity, then past and current values are collected to serve as baseline measures before the faculty development activity is implemented.
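To make the planning step concrete, the sketch below (not part of the original chapter) lays out a possible data-collection plan for a hypothetical workshop; the instruments, timing, and baseline values are illustrative assumptions, while the level names follow Table 4.1.

```python
# A minimal sketch of an ROI evaluation plan for a hypothetical faculty
# development workshop. Instruments, timing, and baseline values are
# illustrative assumptions, not recommendations from the chapter.
evaluation_plan = {
    "Level 1: Reaction": {"instrument": "end-of-workshop questionnaire", "timing": "during activity"},
    "Level 2: Learning": {"instrument": "pre/post self-assessment", "timing": "during activity"},
    "Level 3: Behavior change": {"instrument": "follow-up interviews and observations", "timing": "one semester later"},
    "Level 4: Results": {"instrument": "institutional records (e.g., retention, grants)", "timing": "one year later"},
    "Level 5: ROI": {"instrument": "benefit/cost calculation", "timing": "after Level 4 data are converted to dollars"},
}

# Baseline measures captured before the activity so that later values
# can be compared against them (hypothetical figures).
baseline = {"freshman_retention_rate": 0.80, "external_grants_funded": 42}

for level, plan in evaluation_plan.items():
    print(f"{level}: {plan['instrument']} ({plan['timing']})")
```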
Data Collection
The data collection steps of the ROI process include collecting data during the faculty development activity and again some time after it. Levels 1 and 2 data (reaction and learning) can be collected during the faculty development activity, and levels 3 to 5 data (behavior change, institutional results, ROI) can be collected sometime after the activity, as illustrated in Figure 4.2.
Typically, evaluators use instruments and methods such as questionnaires, interviews, focus groups, assessments, simulations, role plays, and observations to collect data about reaction, learning, and behavior change. The most popular seems to be the questionnaire administered at the end of a faculty development workshop that asks three to five questions about participants’ general reactions. Less popular are methods used for determining specifically what faculty do differently after a faculty development activity. Likewise, few studies about faculty development seem to determine influence on institutional results.
Much of the data collected for level 4, institutional results, can be found in the current systems within the institution (e.g., graduation rates, faculty complaints, student complaints, research articles published, research grants funded externally). Some of the data is more amenable to the ROI process, typically labeled “hard data,” and some of the data is less amenable to the ROI process, typically labeled “soft data.” The challenging part of the ROI process when it comes to level 4 data is converting data to monetary values and isolating the effects of faculty development on the data.
Hard Data Versus Soft Data
Hard data is characterized as objectively based, easy to quantify and measure, relatively easy to assign monetary values to, and credible to institutional leaders. Table 4.2 illustrates the types of hard data that many evaluation professionals use in the ROI process. The data typically come from organizations in manufacturing, sales, services, and similar businesses, not necessarily from institutions of higher education. Some of the hard data are quality improvements, some are output increases, some refer to time savings, and some are cost savings.
(not necessarily for academic organizations, but commonly used elsewhere)

| OUTPUT | COSTS | TIME | QUALITY |
|---|---|---|---|
| Units Produced | Budget Variances | Equipment Downtime | Scrap |
| Tons Manufactured | Unit Costs | Overtime | Waste |
| Items Assembled | Cost By Account | On-Time Shipments | Rejects |
| Money Collected | Variable Costs | Time to Project Completion | Error Rates |
| Items Sold | Fixed Costs | Processing Time | Rework |
| Forms Processed | Overhead Cost | Supervisory Time | Shortages |
| Loans Approved | Operating Costs | Break-in Time for New Employees | Product Defects |
| Inventory Turnover | Number of Cost Reductions | Training Time | Deviation From Standard |
| Patients Visited | Project Cost Savings | Meeting Schedules | Product Failures |
| Applications Processed | Accident Costs | Repair Time | Inventory Adjustments |
| Students Graduated | Program Costs | Efficiency | Time Card Corrections |
| Tasks Completed | Sales Expense | Work Stoppages | Percent of Tasks Completed Properly |
| Output Per Hour | | Order Response | Number of Accidents |
| Productivity | | Late Reporting | |
| Work Backlog | | Lost Time Days | |
| Incentive Bonus | | | |
| Shipments | | | |
| New Accounts Generated | | | |
Soft data is characterized as subjectively based (in many cases), difficult to quantify and measure directly, difficult to assign a monetary value to, and less credible to institutional leaders. Table 4.3 illustrates the types of soft data that many evaluation professionals use in the ROI process.
(not necessarily for academic organizations, but commonly used elsewhere)

| WORK HABITS | CUSTOMER SERVICE | EMPLOYEE DEVELOPMENT/ADVANCEMENT |
|---|---|---|
| Absenteeism | Customer Complaints | Number of Promotions |
| Tardiness | Customer Satisfaction | Number of Pay Increases |
| Visits to the Dispensary | Customer Dissatisfaction | Number of Training Programs Attended |
| First Aid Treatments | Customer Impressions | Requests for Transfer |
| Violations of Safety Rules | Customer Loyalty | Performance Appraisal Ratings |
| Number of Communication Breakdowns | Customer Retention | Increases in Job Effectiveness |
| Excessive Breaks | Customer Value | |
| | Lost Customers | |

| WORK CLIMATE/SATISFACTION | INITIATIVE/INNOVATION |
|---|---|
| Number of Grievances | Implementation of New Ideas |
| Number of Discrimination Charges | Successful Completion of Projects |
| Employee Complaints | Number of Suggestions Implemented |
| Job Satisfaction | Setting Goals and Objectives |
| Employee Turnover | New Products and Services Developed |
| Litigation | New Patents and Copyrights |
| Organizational Commitment | |
| Employee Loyalty | |
| Increased Confidence | |
Whether soft or hard data, level 4 data needs to be isolated to the effects of faculty development and converted to monetary values when possible to prepare for level 5 calculations of ROI. Data not converted to a monetary value is collected and reported as intangible results.
Level 4 Results Influenced by Faculty Development
What are the level 4 results that faculty development influences? Some may be the same as those listed in Tables 4.2 and 4.3, yet most are very different. There is no room for an exhaustive list in this chapter, but Table 4.4 illustrates some of the level 4 results, institutional results, that faculty development can influence. Some of the results are hard data and some are soft data. Discovering and listing the types of results that a particular faculty development activity should influence is a step toward improving efforts to determine the ROI of faculty development.
| Hard Data | Soft Data |
|---|---|
| Student Retention | Faculty Job Satisfaction |
| Student Enrollment | Student Job Placement |
| Decreased Litigation | Reduced Conflict |
| Graduation Rates | Improved Teamwork |
| Time Savings | Reduced Stress |
| Time Savings Following Up on Complaints | Campus Culture That Values Teaching |
| Increases in Alumni Giving | Leadership Improvement |
| Faculty Turnover | Improved Relationships With National Foundations, Associations, and Federal Agencies |
| Reduced Costs Due to Ineffective Teaching | |
| Students Repeating Courses (costs to the state or school sponsor) | |
Data Analysis
The third phase of the ROI process is data analysis. Steps in this phase include isolating the effects of faculty development, converting data to monetary values, capturing the costs of faculty development, identifying intangible results, and calculating the return on investment. The most difficult steps of the entire ROI process occur in this phase. Isolating the effects of faculty development and converting data to monetary values typically are the most challenging steps in the ROI process regardless of what is being evaluated.
Isolating the effects of faculty development may be the most important step in the ROI process. Without performing this step, the entire process can lose credibility and fail to provide an accurate picture of the return on investment of faculty development.
There are multiple techniques to isolate the effects of faculty development on institutional results, but a detailed discussion of each is beyond the scope of this chapter. Techniques include control group research arrangements, trendline analysis, forecasting, regression analysis, correlations, and expert estimates.
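A detailed treatment of these techniques is beyond this chapter, but a brief sketch may make the idea concrete. The example below (not drawn from the original study) uses a simple trendline approach: pre-program values are used to project what a metric would have been without the intervention, and only the gap between the actual post-program value and that projection is credited to faculty development. The data values and the linear-trend assumption are illustrative.

```python
# Minimal sketch of trendline-based isolation (illustrative data).
# The pre-program trend is projected forward; only the difference between
# the actual post-program value and the projection is attributed to the
# faculty development activity.

def project_trend(history: list[float]) -> float:
    """Project the next value from a simple linear (least-squares) trend."""
    n = len(history)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(history) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history)) / \
            sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean
    return intercept + slope * n  # value expected in the next period

# Hypothetical retention rates for the four years before the program.
pre_program_retention = [0.78, 0.79, 0.79, 0.80]
actual_post_program = 0.85

expected_without_program = project_trend(pre_program_retention)
isolated_effect = actual_post_program - expected_without_program
print(f"Expected without program: {expected_without_program:.3f}")
print(f"Effect attributed to faculty development: {isolated_effect:.3f}")
```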
Likewise, there are multiple techniques for converting data to monetary values, but a discussion of each is not possible here. Converting data to monetary values is a very important step in the ROI process and is absolutely necessary to determine the monetary benefits from faculty development. Although the process is challenging, particularly with soft data, it can be methodically accomplished.
The ROI calculation is based on converting both hard and soft data to monetary values. Then, those values are compared to the costs of faculty development and converted to a percentage. A return of more than 15% to 25% on money spent for faculty development would be more than the typical return expected on other investments made to help institutions operate.
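As a worked illustration of this comparison (the figures below are hypothetical, not taken from the chapter), the calculation and the hurdle-rate check might look like this:

```python
# Hypothetical ROI calculation: monetized Level 4 benefits of a faculty
# development activity compared to its fully loaded costs, then checked
# against a 15%-25% hurdle rate. All figures are illustrative.

def roi_percent(benefits: float, costs: float) -> float:
    """Return on investment expressed as a percentage of costs."""
    return (benefits - costs) / costs * 100

benefits = 140_000  # monetized institutional improvements (illustrative)
costs = 100_000     # fully loaded program costs (illustrative)

roi = roi_percent(benefits, costs)
print(f"ROI: {roi:.1f}%")                         # 40.0%
print("Exceeds a 25% hurdle rate:", roi > 25)     # True
```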
In addition to tangible, monetary benefits used in an ROI calculation, most programs will have intangible, nonmonetary benefits such as increased job satisfaction, increased organizational commitment, improved teamwork, and reduced conflicts. For most institutions, intangible, nonmonetary benefits are extremely valuable, often carrying as much influence as the hard data items.
Reporting the Results of the ROI Evaluation
The final phase of the ROI process is to report the results. Audiences interested in the ROI of faculty development will vary, but they will certainly include other faculty developers and the university administrators responsible for teaching and learning. Regardless of the audience, strict adherence to the principles, phases, and steps of the ROI process will make the reporting phase more comfortable.
SUMMARY
The ROI process has not been fully applied to faculty development, but it seems to be highly applicable. Further research about what is currently being used to determine the value of faculty development is needed. Many faculty development units are most likely collecting data at levels 1 and 2. Few are likely to be collecting data at level 3, and few if any are collecting data at levels 4 and 5.
Understanding the ROI process could motivate more faculty development units to collect data at higher levels of evaluation and could provide a common language and framework for the challenging task of proving the value of faculty development. One study at Washington State University demonstrates the effects of faculty development-type activities on level 4 results. That study is discussed in the remainder of this chapter.
The following case study concerning a freshman seminar program attempts to demonstrate the value of faculty development-type activities on institutional results. The study was not a complete application of the ROI process and could be improved upon in several ways related to the ROI process, but the study is a good example of how level 4 higher education results can be influenced by faculty development-type activities.
Case Study: Return on Investment From the Freshman Seminar Program at Washington State University
Overview of the Freshman Seminar Program at Washington State University
Freshman Seminar courses at Washington State University (WSU) are spaces where students gather in learning communities linked to general education courses. The seminars create a space where active, generative learning takes place and where students collaborate to develop a research project from topics in their shared general education course. Experienced and trained undergraduate students serve as leaders in the role of peer facilitator, participating as an academic mentor or as a “hypernaut,” an undergraduate multimedia specialist. Graduate students serve as facilitators and assist and mentor the peer facilitators and hypernauts. Faculty in the linked courses also serve as mentors. Freshman Seminar students have created a Flash animation about the program accessible at http://salc.wsu.edu/freshman/details/flash_page.htm.
The Situation That Led to This Study
The Freshman Seminar Program had participated in many assessment studies since its inception but had not completed an ROI analysis. During the spring 2002 term, the Freshman Seminar Program came under review by a subcommittee of the faculty senate. The Center for Teaching, Learning, and Technology at Washington State University was contacted and asked to analyze the benefits and costs of the program in preparation for the subcommittee meeting.
Other Freshman Seminar Assessment Studies
Jean Henscheid, the original coordinator of the Freshman Seminar Program, recognized the importance of assessment in analyzing and continually improving the program. Subsequent Freshman Seminar coordinators have maintained that culture of assessment. The assessment findings from the early years of the program are summarized below (Henscheid, 1999):
1) Freshman Seminar students are nearly 5% more likely to be retained to the sophomore year than other freshmen (fall 1996 and fall 1997 cohorts).
2) Freshman Seminar students are, overall, not as well prepared academically as the general university freshman population, yet they achieve better overall grade point averages than like students in their first semester at WSU at all preparedness levels (fall 1997 cohort).
3) Surveys of Freshman Seminar students using the Flashlight Item bank showed:
Eighty-three percent of students in the seminars say the emphasis on working in groups helps them understand ideas and concepts taught in the course (fall 1998 cohort).
Eighty-six percent of students in the seminars say they are more comfortable participating in discussions in the Freshman Seminar than in other courses (fall 1998 cohort).
Compared to media-enhanced lecture classes, students are more likely to feel that they had learned to manage large, complex tasks (fall 1996).
Compared to media-enhanced lecture classes, students say they are more likely to feel that they have worked through a process to solve complex problems (fall 1996).
Because the students create projects, 78% say they are better able to communicate their ideas to others (fall 1998 cohort).
Because the students create these projects, 76% say they are better able to understand ideas and concepts taught in the course, and 79% say they are able to exercise their creativity.
Seminar students are statistically significantly more likely to read than other students, more apt to be actively engaged in their learning, more likely to cooperate with other students and to have contact with faculty, and more likely to read basic references and documents (fall 1996 and spring 1997 cohorts).
The vast majority of seminar students say they would recommend a peer facilitated experience (all cohorts).
The Freshman Seminar has participated in Washington State University’s Goals, Activities, and Processes (GAPs) formative assessment survey since its inception during the fall 1999 term. A regression analysis of 2001 GAPs data showed that Freshman Seminar students scored statistically significantly “better” on eight of nine questions relating to principles of good practice in undergraduate education than students in other on-campus Washington State University courses that used web-based course management systems and participated in the GAPs (see Table 4.5).
Note: Dependent variables were regressed on a categorical variable (0 if Freshman Seminar, 1 if other WSU on-campus course) and age.

The question stem from the Goals, Activities, and Processes (GAPs) student survey asked: Because of the way your instructor or teaching assistant facilitated electronic communication (such as threaded discussions or streaming media) in this course, how likely were you to:

Possible responses included: 1 = Much less likely; 2 = Somewhat less likely; 3 = About the same; 4 = Somewhat more likely; 5 = Much more likely; 6 = Not applicable (these responses were removed from the analyses).

| Dependent Variables (Question Leaves) | Regression Coefficient of Categorical Variable | Standard Error of Categorical Variable Coefficient | Age Coefficient | Standard Error of Age Coefficient | N | Intercept | R²/Adj. R² |
|---|---|---|---|---|---|---|---|
Ask for clarification | -.518*** | (0.097) | -0.072 | (0.073) | 567 | 3.89 | .062/.059 |
Discuss course concepts with other students | -.301*** | (0.088) | -0.055 | (0.066) | 563 | 3.71 | .028/.025 |
Work on assignments with other students | -.869*** | (0.094) | -0.037 | (0.072) | 562 | 3.86 | .153/.150 |
Ask other students for comments on coursework | -.507*** | (0.098) | -0.029 | (0.071) | 556 | 3.55 | .055/.052 |
Feel isolated from other students | .555*** | (0.108) | 0.053 | (0.081) | 527 | 2.18 | .059/.055 |
Receive comments from the instructor quickly | -0.159 | (0.088) | -0.092 | (0.065) | 570 | 3.88 | .013/.010 |
Discuss course concepts with instructor | -.560*** | (0.091) | -0.054 | (0.067) | 561 | 3.76 | .079/.075 |
Make use of unique abilities to learn | -.418*** | (0.09) | -0.107 | (0.066) | 565 | 3.85 | .055/.052 |
Challenged to create own understanding | -.210*** | (0.089) | -0.078 | (0.068) | 559 | 3.81 | .017/.014 |
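For readers who want to reproduce this kind of analysis, the sketch below shows how one row of Table 4.5 could be estimated with ordinary least squares. The data file, variable names, and coding are assumptions, since the GAPs dataset itself is not included in this chapter.

```python
# Sketch of one Table 4.5 regression: a GAPs survey item regressed on a
# categorical course indicator (0 = Freshman Seminar, 1 = other WSU
# on-campus course) and age. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

gaps = pd.read_csv("gaps_2001.csv")                     # hypothetical survey export
gaps = gaps[gaps["ask_for_clarification"] != 6]         # drop "Not applicable" responses

model = smf.ols("ask_for_clarification ~ course_type + age", data=gaps).fit()

print(model.params)                     # coefficients for the categorical variable and age
print(model.bse)                        # standard errors
print(model.rsquared, model.rsquared_adj)
```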
A qualitative analysis was conducted on Freshman Seminar focus groups. Four questions were asked:
1) What is your (the student’s) role in your learning?
2) What is the hypernaut’s role in your learning?
3) What is the peer facilitator’s role in your learning?
4) What is your definition of critical thinking?
In general, the responses were very positive. Averaged across all four questions, 69% of the students in the focus groups answered positively, 21% were neutral, and 10% were negative.
ROI Summary (Actual Data Is Not Used to Protect Privacy, but Final Results Are Comparable to the Actual Study)
The focus groups, student Flashlight surveys, GAPs surveys, and analyses of grade point averages all highlighted very positive results from the Freshman Seminar Program but would be classified as soft data because they are difficult to assign monetary value to. However, the increased retention from the Freshman Seminar programs (Henscheid, 2001) provided an opportunity to put a monetary value on at least some of the benefits. A question then immediately presented itself: Would an additional 4% or 5% retention rate pay for the program?
Revenue Assumptions
Using the average increased retention rate of the Freshman Seminar Program, the increase in Average Annual Full Time Equivalents (AAFTEs) going into the sophomore year could be estimated. The increase in juniors and seniors was then estimated by multiplying the previous year’s increase by the retention rate for that year and truncating the result. For example, the retention rate from sophomore to junior at Washington State University is approximately 90%. The sophomores retained into the junior year as a result of the Freshman Seminar Program, over and above those not in the seminar program, were estimated to be 18.9 students; that estimate was truncated to 18 students.
| | Estimated Increase in AAFTEs |
|---|---|
| From freshman to sophomore | 21 |
| From sophomore to junior | 18 |
| From junior to senior | 16 |
| Total | 55 |
The next question became: Would an additional 55 students at WSU (as a result of the Freshman Seminar Program) generate enough revenue to cover the costs of the program and produce a positive ROI? The annual increase in revenue was estimated by multiplying the AAFTEs from the increased retention by the tuition and state support per AAFTE. This estimate is probably conservative; the additional students on campus would generate other revenue through sports passes, recreation center passes, room and board, parking, and participation in other programs.
| | Estimated Increased AAFTEs | Annual Tuition per AAFTE | Annual State Funds per AAFTE | Total Revenue Increase |
|---|---|---|---|---|
| Sophomores | 21 | $3,200 | $5,000 | $172,200 |
| Juniors | 18 | $3,200 | $5,000 | $147,600 |
| Seniors | 16 | $3,200 | $5,000 | $131,200 |
| Total Estimated Revenue Increase | | | | $451,000 |
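The retention and revenue arithmetic above can be reproduced in a few lines. The junior-to-senior retention rate is not stated in the chapter, so the sketch assumes it is roughly the same 90% used for the sophomore-to-junior transition.

```python
import math

# Reproduce the AAFTE and revenue estimates reported above. The
# junior-to-senior retention rate of 90% is an assumption; only the
# sophomore-to-junior rate is cited in the chapter.
extra_sophomores = 21                       # retained because of the Freshman Seminar
soph_to_junior_retention = 0.90
junior_to_senior_retention = 0.90           # assumption

extra_juniors = math.trunc(extra_sophomores * soph_to_junior_retention)    # 18
extra_seniors = math.trunc(extra_juniors * junior_to_senior_retention)     # 16

tuition_per_aafte = 3_200
state_funds_per_aafte = 5_000
revenue_per_aafte = tuition_per_aafte + state_funds_per_aafte               # $8,200

total_aaftes = extra_sophomores + extra_juniors + extra_seniors              # 55
total_revenue = total_aaftes * revenue_per_aafte                             # $451,000
print(total_aaftes, total_revenue)
```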
Estimated Annual Costs
The final questions from this study included: Would $451,000 per year be enough to cover the costs of the Freshman Seminar Program? What kind of an ROI does the program generate? The annual Freshman Seminar budget was $125,000, which included stipends for participating faculty, payment of the graduate student facilitators and undergraduate peer facilitators, and enough to cover supplies and some equipment purchases. However, the salary of the coordinator was not included nor were the costs of direct supervisors, administrative support, or building and equipment cost estimates. Table 4.8 gives an overall summary of the estimated costs of the Freshman Seminar Program. Major direct costs were included in the estimates.
1) Annual Freshman Seminar budget | $125,000 |
2) Administrative* | 57,302 |
3) Other staff support* | 12,192 |
4) Estimated room depreciation** | 7,961 |
5) Estimated equipment depreciation*** | 26,750 |
Total Estimated Annual FS Costs | $229,205 |
Questions for all ROI or cost studies include when the major costs have been captured and whether indirect costs should be allocated. Extra weeks or months could have been spent on this project estimating how to allocate indirect costs. WSU’s experience with the Technology Costing Methodology (TCM) reinforced the decision not to estimate all indirect costs, to “avoid the effort involved in allocating costs to obtain results that are seldom of managerial utility” (Jones, 2001, p. 16). Tables 4.9, 4.10, and 4.11 show some examples of the detail behind the cost summary and the allocation of some major indirect costs.
For many education cost estimates (such as the cost of a class or program) the “people costs” dominate, that is, the cost of salaries, wages, and benefits of the faculty and staff involved with the project. The following estimate includes 100% of the costs (not actual) of the coordinator of the Freshman Seminar Program, plus estimates for the portion of time spent on the Freshman Seminar by the coordinator’s direct supervisor, the associate vice president for educational development. The cost of administrative assistance is estimated in Table 4.9.
| | Coordinator 100% | Associate VP 10% |
|---|---|---|
| Salary per pay period (not actual) | $1,500 | $3,800 |
| Number of pay periods | 24 | 24 |
| % allocable to Freshman Seminars | 100% | 10% |
| Add benefits | 1.27 | 1.27 |
| Salary and Benefit Costs to FS | $45,720 | $11,582 |
| Other Staff Support Estimated as: | |
|---|---|
| Two support personnel earning $24,000 per year each | $48,000 |
| 27% benefits | 1.27 |
| Estimated time spent on Freshman Seminars | 20% |
| Estimated Staff Support | $12,192 |
Freshman Seminars meet in computer laboratories for most of their classes. For this study it was estimated that they used the rooms 75% of the time. Table 4.11 estimates the annual depreciation of equipment; Table 4.12 estimates the annual depreciation of the rooms used for classes/computer laboratories.
Estimated Return on Investment
The estimated return on investment for Freshman Seminars during the last one-year period was ($451,000 – $229,205) / $229,205, or 96.8%, which is a very strong ROI. In one academic year, the Freshman Seminar Program at WSU generated almost twice as much revenue as it cost, through the students retained at WSU because of the program.
| Estimated cost for 40 computers | $100,000 |
| Estimated cost for seven scanners and printers | $7,000 |
| Total equipment cost | $107,000 |
| Assume three-year depreciable life | $35,666.67 |
| 1) Original cost of Lighty | $15,872,186 |
| 2) Divide by Lighty square feet | 94,924 |
| 3) Gives cost per square foot | $167.21 |
| 4) Depreciation per year per sq. ft. (50-year life) | $3.34 |
| 5) Square footage of labs: 260Z @ 1,303, 260W @ 807, & 260F @ 1,064 | 3,174 |
| 6) Assume 75% use for Freshman Seminar | 75% |
| 7) Annual depreciation cost of labs (4 × 5 × 6) | $7,961 |
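Pulling the cost tables and the revenue estimate together, the full calculation can be sketched as follows. The chapter states the 75% usage factor for the rooms; applying the same factor to equipment depreciation reproduces the $26,750 figure in the cost summary, so that step is shown here as an inference rather than a documented assumption.

```python
# Sketch of the full cost roll-up and ROI calculation for the Freshman
# Seminar Program, using the figures reported in the chapter.

# Administrative costs (Table 4.9)
coordinator = 1_500 * 24 * 1.00 * 1.27           # $45,720
associate_vp = 3_800 * 24 * 0.10 * 1.27          # $11,582 (rounded)
administrative = round(coordinator) + round(associate_vp)

# Other staff support (Table 4.10)
staff_support = round(48_000 * 1.27 * 0.20)      # $12,192

# Equipment depreciation (Table 4.11); applying the 75% usage factor
# reproduces the $26,750 in the cost summary (inferred, not stated).
equipment_depreciation = round((107_000 / 3) * 0.75)   # $26,750

# Room depreciation (Table 4.12)
room_depreciation = 7_961                        # as reported

program_budget = 125_000
total_costs = (program_budget + administrative + staff_support
               + room_depreciation + equipment_depreciation)    # $229,205

total_revenue = 451_000                          # from the retention estimate above
roi = (total_revenue - total_costs) / total_costs * 100
print(f"Total costs: ${total_costs:,}  ROI: {roi:.1f}%")         # ~96.8%
```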
LESSONS LEARNED
Many higher education studies involving cost estimates (including this ROI analysis) will find that people costs, that is, salaries, wages, and benefits, dominate the costs of the program or unit being studied. Return on investment analysis gives an estimate of the added revenues and costs of a program; the estimates are good if the analyst is careful, but they are not exact numbers. It is often necessary to explain this to faculty and staff as data are gathered for the analysis. ROI analysis is an effective, well-grounded technique that can be used as a formative assessment tool to show the return of a program and highlight possible areas of improvement. It may be an especially valuable assessment tool as state funding for higher education decreases and as higher education institutions attempt to be more productive.
REFERENCES
- Baron, D. (2002). ASTD ROI network. Retrieved November 11, 2002, from http://roi.astd.org/index.aspx
- Henscheid, J. M. (1999). Washington State University freshman seminar program research findings. Retrieved November 17, 2002, from http://salc.wsu.edu/freshman/details/research_findings.htm
- Henscheid, J. M. (2001). Peer facilitators as lead freshman seminar instructors. In J. E. Miller, J. E. Groccia, & M. S. Miller (Eds.), Student-assisted teaching: A guide to faculty-student teamwork (pp. 21-26). Bolton, MA: Anker.
- Jones, D. (2001). Technology costing methodology project. Retrieved November 19, 2002, from http://www.wcet.info/projects/tcm/TCM_Handbook_Final.pdf
- Kirkpatrick, D. L. (1996). Evaluating training programs: The four levels. San Francisco, CA: Berrett-Koehler.
- Phillips, J. J. (1997a). Return on investment in training and performance improvement programs. Woburn, MA: Butterworth-Heinemann.
- Phillips, J. J. (1997b). Handbook of training evaluation and measurement methods. Houston, TX: Gulf Publishing.
- Phillips, J. J. (2000). The consultant’s scorecard: Tracking results and bottom-line impact of consulting projects. New York, NY: McGraw-Hill.
Contact:
Timothy W. Bothell
Brigham Young University
4432 WSC
Provo, UT 84602
Voice (801) 422-8194
Fax (801) 422-0223
Email [email protected]
Tom Henderson
Center for TLT
Washington State University
Box 644550
Pullman, WA 99164
Voice (509) 335-6451
Fax (509) 335-1362
Email [email protected]
Timothy W. Bothell is Faculty Development Coordinator for the Assessment of Student Learning at Brigham Young University. He currently conducts workshops and works with faculty one-on-one to improve the assessment of student learning. He also directs the Exam Improvement Center within Brigham Young University’s Faculty Center. Faculty from all colleges and departments can leave their exams at the Exam Improvement Center for feedback and suggestions. In addition, as an independent consultant, he consults with organizations concerning the return on investment of learning.
Tom Henderson is Assessment Coordinator at the Center for Teaching, Learning, and Technology at Washington State University. He is a co-leader of a WSU team that is adapting the Western Cooperative for Educational Telecommunication’s Technology Costing Methodology (TCM) to assess the processes as well as the costs of WSU course development activities and adapt that information to the TCM/mini-Bridge cost simulation model. He has also field-tested the Flashlight Cost Model while analyzing the costs of course management technologies at WSU. He has over 12 years of experience in private sector accounting and financial analysis. He has a PhD in interdisciplinary studies from Washington State University, an MBA in finance from the University of Washington, and a B.S. in accounting from the University of Idaho.