Getting Started

How to Use This Toolkit

This toolkit is divided into three main parts. It is designed primarily for copyright review of books, but much of it applies to other copyright review activities as well. The first part of the toolkit consists of a series of preplanning documents, one or more of which can be used in early-stage project meetings to build your team and plan your approach to key questions. These documents are meant to help you decide who will be doing the work for your copyright review project and how they will be doing that work. Specifically, the preplanning section should help you

  • assemble the team that you will be working with to perform copyright determinations
  • identify the candidate volumes that you will be reviewing
  • define your review process, workflow, and your project’s desired outcomes
  • build the case for your project to senior administrators

The second part of the toolkit dives deeper into the practical considerations facing a copyright review project, including project leadership, the legal fundamentals for copyright review, technical elements, and observations related to project personnel. We document many of the lessons learned over our years of CRMS activity and hope you will find this resource useful.

Before proceeding with this toolkit, you may want to skim the glossary, where we define key terms that appear throughout the text.

The third part of the toolkit includes reports on pilot projects and a series of appendices. Together these form valuable documentation from the project. The pilot project reports detail discrete subprojects we explored through CRMS over the past several years. They are meant to provide a sense of both the opportunities and limitations of copyright review projects at scale. Topics covered include our experience piloting Spanish-language reviews, our efforts to improve name authority records (a useful by-product of our copyright review activity), and the expansion of CRMS activities to copyright-notice–based review of US state government documents. The appendices provide project resources that can serve as models or be repurposed for future projects.

Finally, we want this toolkit to be helpful, but we also aim to inspire a measure of caution. Copyright review, especially at scale, is challenging, and we want to be unambiguous about the difficulties associated with this work. If you are going to go down this path, we urge you to spend substantial time planning, to consider every tool and question we have identified in the preplanning portion of this toolkit, and to pilot your project before fully committing to a particular course of review activity. Your early-stage planning will pay substantial dividends over time.

Preplanning Document 1: Building Your Team

CRMS evolved into a large-scale review project with nineteen partner institutions and more than sixty reviewers. Significant staff time was required to train and oversee the work of those reviewers and to manage administrative requirements related to system security, access to digital scans, ongoing project documentation, and grant-based cost-share paperwork. The division of labor outlined here reflects the scale of CRMS: this document describes five roles and recommends a minimum team of seven for larger projects. Your preplanning team should include a project manager and a legal expert at the earliest stages, with additional roles added as the project develops. Smaller-scale projects may be able to blend these roles and work with a smaller team. However, if your project grows in scale, it is important to consider the impact of that growth on staff resources.

1. Project Manager

Role Description

The project manager has overall responsibility for the project. The project manager is a liaison with HathiTrust (or other institutional administration) and ensures that formal requirements of the project are met and well documented. The project manager also works with the other team members to ensure that all component parts of the project are operating effectively.

Key Considerations
  • If working with HathiTrust, who on your team organizes the documentation required to facilitate reviewer access to digital scans, troubleshoots access as needed, and renews access on a regular basis?
  • What documentation (monthly reports, project-related memos, training materials) does your project require, and who is responsible for maintaining and archiving this documentation?
  • Are there cost-share requirements or other financial reporting requirements for your grant? If yes, who is the liaison with partner institutions, ensuring that all relevant documents are collected and reported properly?
Additional Notes

Large-scale projects—especially multi-institution, grant-funded projects working with HathiTrust security protocols—generate significant, ongoing administrative work. Managing and accounting for the work and documentation tied to cost-share commitments is complex. (For example, participants must understand whether the grant requires that cost-share commitments be accounted for as the dollar value of labor rather than as effort/time alone.) Your team needs to consider this workload when planning.

2. Legal Expert

Role Description

The legal expert researches and identifies the legal considerations relevant to the project, then works with the project team to design the review process. The legal expert also oversees project development to ensure that it conforms to current law.

Key Considerations
  • Does your project team have a dedicated copyright expert?
  • What is the copyright expert’s relationship with your institution’s office of general counsel?
  • Is the copyright expert’s legal expertise sufficient for your proposed review project, or does your expert need to consult with others? If outside expertise is required, have you identified potential advisors?
  • Do you have access to outside copyright expertise or oversight from an advisory group?
  • Have one or more outside copyright experts verified your copyright review plan?
  • After your project has started, how will you address new or unforeseen legal questions not covered in your initial planning documents?
Additional Notes

Copyright review projects present some legal risk, so your office of general counsel or equivalent should be made aware of your project and approve of your methods and workflow.

3. Developer

Role Description

The developer builds and maintains the online review interface, translates the legal framework into algorithms, adds new tools when available, and adapts and updates the system as needed. A dedicated developer is ideal; at a minimum, some percentage of a developer’s time must be committed for the duration of any rights research project that relies on an online interface.

Key Considerations
  • Are you using an online interface to manage all reviews?
  • Have you consulted with a developer to anticipate future needs, based on your project’s duration and potential evolution? What project changes, if any, do you anticipate over time?
  • Who maintains the interface if software changes impede its operation?
  • Who troubleshoots for you if the system goes down? How does system downtime affect the rest of your project plan?
  • Have you identified a full-time or part-time developer who can dedicate considerable time to your project as needed?
  • Has your developer reviewed the requirements for a copyright review management system as detailed in the technical section?
Additional Notes

The CRMS project relies on the CRMS online interface detailed in the technical section of the toolkit. The interface required consistent development over time—new project tools emerged, outside changes (to HathiTrust or web browsers, for example) necessitated corresponding changes to the interface, and we explored new projects that also required adaptations of the interface.
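
The role description above notes that the developer “translates the legal framework into algorithms.” As one minimal, hypothetical illustration of what that can mean in practice, the Python sketch below encodes a single, simplified renewal-based rule for US books. The function name, determination labels, and date range are our illustrative assumptions, not the CRMS implementation; the rules your interface actually enforces must come from your legal expert and your project’s decision trees.

    # Hypothetical sketch: one simplified, renewal-based rule for US books,
    # written the way a developer might encode part of a project decision tree.
    # The labels and cutoff years are illustrative assumptions, not legal advice;
    # your legal expert defines the rules your interface actually enforces.

    def renewal_based_determination(pub_year: int, renewal_found: bool,
                                    renewal_search_complete: bool) -> str:
        """Return a provisional determination label for a US-published book."""
        if not (1923 <= pub_year <= 1963):
            # Outside the range this simplified rule covers; send to a human.
            return "undetermined"
        if not renewal_search_complete:
            # Incomplete research should never yield a final status.
            return "undetermined"
        if renewal_found:
            return "in_copyright"
        # Published 1923-1963 in the US with no renewal located: a public domain
        # *candidate*, still subject to reviewer confirmation.
        return "public_domain_candidate"

    print(renewal_based_determination(1950, renewal_found=False,
                                      renewal_search_complete=True))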

4. Training and Reviewer Manager (Quality Control)

Role Description

Training and reviewer management are ongoing activities for large-scale review projects. Your project team should include at least one member focused on training reviewers and maintaining consistency in project execution.

Key Considerations
  • Does your team have at least one point person for communicating with reviewers and answering their questions? Who sets workflow policy as needs arise?
  • Are your reviewers held to any performance standards requiring oversight?
  • Do you provide ongoing training as needed or primarily at the beginning of the project?
  • Do you anticipate reviewer turnover during the course of your project? How do you bring on new reviewers?
  • Do you have a plan for communicating with and updating all reviewers on any necessary changes?
  • How do you document those changes over time in a way that reviewers and managers can reference and understand if they join the project after it has started?
  • What training and assessment tools (e.g., video conferencing for remote reviewers, online quizzes, reviewer performance metrics) are available to your project team?
Additional Notes

If you have a small group of reviewers with little anticipated turnover, your project may require less oversight. Your project will require more consistent oversight and ongoing opportunities for reviewer training if you anticipate managing a growing number of reviewers over time, if reviewer turnover is expected on a regular basis, or if the project is relatively complex.

5. Copyright Reviewers

Role Description

Copyright reviewers perform the day-to-day copyright reviews, working directly with your project’s candidate volumes and rendering copyright determinations for those volumes. The number of reviewers will vary depending on the scale of your project.

Key Considerations
  • How many reviewers work on the project? What is their time commitment? What is their hourly rate (dollar value of time committed based on salary) for accounting and cost-share purposes, if required?
  • Do reviewers possess the language skills necessary to review the candidate pool?
  • How do you add new reviewers to the project? Are reviewers removed from the project if they fail to meet certain objective requirements? When and how would you conduct such assessments?
  • Do you have a set timeline for completing reviews? Is this timeline reasonable, given the number of reviewers and an approximation of the time required to review the types of volumes in your candidate pool?
  • Have you identified expert reviewers (reviewers who can resolve conflicts in your review queue)? (A conflict occurs when two reviews for the same volume do not match.)
Additional Notes

Regardless of project scale, we recommend a minimum of three reviewers for any copyright review project, to allow for double review (see “Double Review” section).
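
To make the double-review model concrete, the brief Python sketch below illustrates the matching step: two independent determinations for the same volume are compared, and any mismatch is flagged as a conflict for an expert reviewer. The volume identifier, labels, and data shapes are hypothetical and included only for illustration.

    # Hypothetical sketch of the double-review matching step. The volume
    # identifier, labels, and data shapes are illustrative assumptions.
    from typing import Optional

    def resolve_double_review(first: str, second: str) -> Optional[str]:
        """Return the agreed determination, or None if the two reviews conflict."""
        if first == second:
            return first   # matching reviews become the provisional result
        return None        # conflict: route the volume to an expert reviewer

    reviews = {"example.0001": ("public_domain", "in_copyright")}  # made-up volume ID

    for volume_id, (r1, r2) in reviews.items():
        result = resolve_double_review(r1, r2)
        if result is None:
            print(f"{volume_id}: conflict, send to expert reviewer")
        else:
            print(f"{volume_id}: provisional determination = {result}")

A review interface can run this comparison automatically; a spreadsheet-based project would perform the same check by hand.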

Preplanning Document 2: Building Your Project

This set of questions is meant to help you design your copyright review project. These questions may overlap with those in the previous preplanning document; here they are framed in terms of the project as a whole rather than by individual team roles. To better understand these questions, your project team should consult the body of the CRMS toolkit. Before undertaking a large-scale copyright review project, carefully consider and address each of the following questions.

Institutional Commitment

  1. Does your institution’s leadership understand the goals and risks of your project?
  2. Has your institution’s leadership approved your project?
  3. Is your project funded and/or is staff time dedicated specifically to copyright review?
  4. Is your institution’s general counsel aware of your project and supportive?
  5. Do you have access to a legal advisor familiar with copyright law?

Project Design

  1. What is the primary goal of your project (e.g., identifying public domain volumes, collecting copyright-relevant information about volumes in your collection)?
  2. What is the scope of your copyright review?
    1. Are you reviewing books or some other kind of material, such as serials, sound recordings, or other media? Are you reviewing only one type of material or multiple types?
    2. What is the date range?
    3. Which countries of publication are involved? Are you targeting only one country or multiple countries?
    4. What languages are used in the material to be reviewed?
    5. Are there other particular features of the proposed collection that would have bearing on copyright determinations (e.g., publication status, contested or ambiguous applicable law)?
  3. What scope of access do you intend to provide to volumes you have reviewed (e.g., institution only, US-based access, worldwide access)?
  4. Are you concerned about duplicative activity? Have you verified that the volumes you plan on reviewing are not already freely available online?
  5. If another copyright review project has reviewed similar volumes, what can you learn about their process to help improve your own reviews? Will you choose to accept their determinations, and how will you document that decision?
  6. Have you identified the information you need to collect in order to make copyright determinations for your project (e.g., author death dates, US copyright renewal research)?
  7. If you are basing your determinations on author death dates, have you identified the research tools (e.g., New General Catalog of Old Books & Authors [NGCOBA], Virtual International Authority File) you need to collect copyright-relevant information? If you are basing your copyright review on formalities, what tools do you plan on using (e.g., Stanford Copyright Renewal Database, Catalog of Copyright Entries, other)? (Note that the Stanford database consists almost exclusively of renewal records for books.) A simplified sketch of the term arithmetic behind each approach follows this list.
  8. What is your project timeline? Is it based on the number of volumes to be reviewed, institutional demands, or some other metric? Is it reasonable?
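
Question 7 asks whether your determinations will rest on author death dates or on US formalities. As a simplified illustration of the arithmetic behind each approach, the Python sketch below uses two common rules of thumb: life plus 70 years, and 95 years from publication for renewed US works of the 1923–1963 era. Both figures are assumptions for illustration only; the terms that actually apply must be confirmed with your legal expert.

    # Hypothetical sketch of the arithmetic behind the two approaches in
    # question 7. The 70-year and 95-year figures are common rules of thumb
    # (life plus 70 in many countries; 95 years from publication for renewed
    # US works of the 1923-1963 era) and are illustrative assumptions only.

    def pd_year_from_death_date(death_year: int, years_after_death: int = 70) -> int:
        """Year a work enters the public domain in a life-plus-N jurisdiction."""
        return death_year + years_after_death + 1   # terms generally run to year end

    def pd_year_from_renewed_us_publication(pub_year: int, total_term: int = 95) -> int:
        """Year a renewed US work of the 1923-1963 era enters the public domain."""
        return pub_year + total_term + 1

    print(pd_year_from_death_date(1940))               # 2011
    print(pd_year_from_renewed_us_publication(1950))   # 2046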

Data Collection

  1. For volumes currently in copyright, are you collecting data sufficient for predicting when those volumes may enter the public domain?
  2. Do your data collection methods consider future collection management and digitization decisions? For example, could your project easily identify authors whose works are likely to be in the public domain and then digitize accordingly?
  3. Have you identified elements of bibliographic metadata that are likely to be useful for future searches and may be relevant for improving catalog records? Do you have a plan for encouraging reviewers to record these metadata in a consistent and uniform manner that will facilitate database search and retrieval?

Legal

  1. What legal resources and personnel will you use to map out your copyright review process?
  2. Have you identified a legal advisor who can provide feedback on your copyright review plan?
  3. Are you basing your copyright review on past US copyright formalities (i.e., renewal and/or copyright notice)?
  4. Have you accounted for copyright restoration in the United States due to the Uruguay Round Agreements Act (URAA), embodied in 17 U.S.C. § 104A?
  5. If you are reviewing non-US publications, what resources and expertise will you draw on to understand the copyright laws of the relevant countries?
  6. Are there categories of works that your project defines as unpublished? How do you make the determination that the works are unpublished? How does your project plan to determine the copyright status for these unpublished works?
  7. How will your project approach possible third-party authored content (inserts) within the volumes you review?
  8. What facts (or lack of facts) will lead your reviewers to an “undetermined/need further investigation” determination for a given review?
  9. If your project plans to make digital copies of volumes available as a result of your review, do you have a notice and takedown procedure in place?
  10. Have you discussed this project with your institution’s general counsel?

Project Management

  1. How many reviewers will participate in your review project? Are they centrally located, or are they geographically dispersed?
  2. How much time will each reviewer commit to the project per week?
  3. What is the management structure of your review project?
  4. Who will oversee reviewers? How will the project manager define expectations and monitor reviewers’ accuracy and productivity levels? How will any performance issues be addressed?
  5. Do reviewers have access to dedicated terminals in a secure, nonpublic area? Are those terminals equipped with wide-screen monitors appropriate for reviewing digital scans of volumes?
  6. How will you recognize and celebrate the contributions of the reviewers to the project?
  7. What channels will you use to report and promote the progress of the project?

Training

  1. Will you consider adding new reviewers over time? If yes, who will train new reviewers?
  2. Does your training plan include a “sandbox,” where reviewers can practice on predetermined volumes?
  3. What training materials and methods will you employ when bringing new reviewers on board?
  4. Do you have a performance threshold, below which reviewers will be retrained or removed from the project?
  5. Do your training materials encourage uniformity and consistency in note-taking, especially for metadata terms that may be useful for searching the project database and making improvements to bibliographic metadata?

Process

  1. Will your project employ a double-review system, or will one reviewer’s conclusion be determinative?
  2. Do you have decision trees to guide reviewer behavior? Have you developed any other tools to help reviewers navigate the review process?
  3. What is the full range of copyright determinations that can be made in your system? “Public domain”? “In copyright”? What else?
  4. Are you using a “review interface” to make and track your determinations, or are you using spreadsheets to perform this work?
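
Whether you use a review interface or spreadsheets, each review ultimately becomes a record tying a volume to a determination. The Python sketch below shows one minimal, hypothetical shape for such a record; the field names and determination labels are our own illustrative choices, not the CRMS schema.

    # Hypothetical sketch of a minimal review record, whether stored by a review
    # interface or kept in a spreadsheet. Field names and determination labels
    # are illustrative choices, not the CRMS schema.
    from dataclasses import dataclass, field
    from datetime import date
    from typing import Optional

    DETERMINATIONS = {"public_domain", "in_copyright", "undetermined"}

    @dataclass
    class ReviewRecord:
        volume_id: str                           # identifier for the candidate volume
        reviewer: str                            # who made the determination
        determination: str                       # one of DETERMINATIONS
        rationale: str = ""                      # e.g., "no renewal found"
        review_date: date = field(default_factory=date.today)
        expert_resolution: Optional[str] = None  # filled in only after a conflict

        def __post_init__(self):
            if self.determination not in DETERMINATIONS:
                raise ValueError(f"unknown determination: {self.determination}")

    record = ReviewRecord("example.0001", "reviewer_a", "public_domain",
                          rationale="no renewal located for a 1950 US publication")
    print(record)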

Technical Considerations

  1. Have you identified developer resources to support your project? Has your institution committed a dedicated developer to your project?
  2. Has your institution committed the computational resources to serve a Web-based review interface and the database infrastructure to store review data? If stored data is lost, can it be restored from backup?
  3. Can your institution guarantee a reasonable amount of system uptime to allow reviewers to work free of interruption? Does your institution have support staff that can respond to an outage quickly?
  4. Does your institution have the security infrastructure to prevent unauthorized access to the system and the scans?

Verification

  1. Do you have quality control methods built into your process, like a double-review system?
  2. Will you work with a third party to independently check a given number of your results? If yes, what is your procedure for an external check?
  3. If an external check provides useful information related to your review process, what is your plan for integrating that information into your process?

Funding

  1. How is your project work being funded?
  2. If your work is funded through a grant, what are the reporting requirements of the grant? What documentation do you need to collect? What are the important grant deadlines that your team members need to be aware of?
  3. If your work is funded through a multipartner cost-share grant, can your partners maintain the cost-share commitment if key project personnel depart?
  4. Does your institution have a plan for sustaining the work after the end of the grant period?
  5. What are the long-term costs for sustaining your review project?

Preplanning Document 3: CRMS Project Decision Points

This list is meant to guide new project planners through the key decision points for their copyright review project. Over the years, we have found that the following questions must be addressed when undertaking copyright review of books at scale. Planning how your project team intends to treat categories of work (e.g., translations, dissertations, dictionaries) will help you allocate reviewer resources more effectively and understand the research tools you will need to reach a determination.

Please describe in detail how your project will treat the following copyright-related issues:

Foreign Language/Script

How will your reviewers work with volumes in foreign languages? Does your project have a mechanism for referring foreign language volumes to a reviewer with the relevant language proficiency, or will your project disregard foreign language volumes?

Inserts

Do you expect your reviewers to look for the presence of third-party authored materials in volumes they review? If so, how much scrutiny do you expect your reviewers to apply? How will your reviewers treat the presence of third-party authored materials incorporated into a volume being reviewed? What does or does not count as an insert?

Translations

When a work is identified as a translation (or contains translations), what guidance do you provide reviewers?

Dissertation/Thesis

Will your review project treat dissertations or theses differently from other published works? In what ways will you treat them differently?

Periodicals

If your project team will review periodicals, how will you identify third-party authored content in the periodicals? What assumptions are you making regarding works made for hire?

Non–Class A Works (United States)

Most books published in the United States between 1923 and 1963 are referred to as “Class A” works by the US Copyright Office. Renewal records for these books can be searched in the Stanford Copyright Renewal Database. Non–Class A works include serials, artwork, photographs, screenplays, and works prepared for oral delivery. We have found that renewal records for non–Class A works are harder to research due to the absence of a resource like the Stanford Copyright Renewal Database. If your project is based on the presence or absence of a copyright renewal for US works, will you extend your project to non–Class A works? If yes, how do you intend to do this?

Editions

Does your project address the possibility of variable copyright terms for multiple editions of a work?

Government Works

What guidance will you provide to reviewers for identifying a government work, such as Crown copyright for Commonwealth countries?

Author-Based Determinations

For projects that base copyright determinations on the death date of the author of the work (as opposed to formalities, including US copyright renewal and notice requirements), how will your project treat the following categories of works? A simplified illustration of one possible treatment follows the list.

  • Known author
  • Known (multiple) authors
  • Uncertain or conflicting death dates for known authors
  • Unknown/anonymous author(s)
  • Corporate authors
  • Government works
  • Unpublished works
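
As one simplified illustration (not legal guidance), the Python sketch below shows how a project in a life-plus-70 jurisdiction might dispatch these categories: single and multiple known authors are computed from death dates, while the remaining categories are routed to further research. Every rule in the sketch, including the life-plus-70 term and the treatment of joint works, is an assumption for illustration; your legal expert defines the actual treatment of each category.

    # Hypothetical dispatch sketch for death-date-based review, assuming a
    # life-plus-70 jurisdiction. The handling of each category is an
    # illustrative assumption; your legal expert defines the real rules.
    from typing import List, Optional

    def death_date_determination(category: str,
                                 death_years: Optional[List[int]] = None,
                                 review_year: int = 2025) -> str:  # default year chosen only for the examples
        if category == "known_author" and death_years and len(death_years) == 1:
            return "pd_candidate" if death_years[0] + 70 < review_year else "in_copyright"
        if category == "known_authors" and death_years:
            # Joint works are often measured from the last surviving author (assumption).
            return "pd_candidate" if max(death_years) + 70 < review_year else "in_copyright"
        # Uncertain or conflicting death dates, anonymous, corporate, government,
        # and unpublished works all need project-specific rules or more research.
        return "undetermined"

    print(death_date_determination("known_author", [1940]))         # pd_candidate
    print(death_date_determination("known_authors", [1940, 1990]))  # in_copyright
    print(death_date_determination("anonymous"))                    # undetermined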