Verification

It is important to build checks into your processes and assumptions so you can be confident your system is working as intended and can address unforeseen issues when necessary. Internally, we have added forms of verification directly into our review process. In-house verification is one method, but working with an independent third party is a valuable additional means of verification. Consider engaging a third party to examine and evaluate the accuracy of your results. Methods of verification should focus on two areas: results and process.

Double Review

We are committed to the double-review process, particularly for copyright review projects operating at a large scale. This process requires two separate, independent reviewers to agree on the rights status of a work. If the two reviews do not agree, a third, expert reviewer adjudicates the two reviews and selects the most appropriate determination for the volume.
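
To make the workflow concrete, the adjudication logic might be expressed along the lines of the sketch below. This is only an illustration under stated assumptions, not CRMS's actual implementation; the function name and the rights status strings are placeholders invented for the example.

```python
from typing import Optional

def resolve_determination(review_1: str, review_2: str,
                          expert_review: Optional[str] = None) -> Optional[str]:
    """Return a rights determination for a volume under a double-review policy.

    Each argument is a rights status string (for example, "pd" or "ic").
    Matching independent reviews stand on their own; conflicting reviews
    are held until an expert reviewer supplies the final determination.
    """
    if review_1 == review_2:
        return review_1              # the two independent reviewers agree
    if expert_review is not None:
        return expert_review         # expert reviewer adjudicates the conflict
    return None                      # unresolved: awaiting expert review
```

In this sketch, resolve_determination("pd", "pd") yields a determination immediately, while resolve_determination("pd", "ic") returns nothing until an expert review is supplied.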

The double review is a form of verification that provides CRMS with a daily check on our determinations. We have a high degree of confidence in our results because each review is performed at least twice and conflicting reviews receive additional attention from an expert reviewer. This does not protect against any underlying flaws in our methods, but it helps prevent human error from having large-scale consequences.

The double-review process carries an additional cost in time and labor. With the same amount of reviewer labor, we could roughly double the number of reviews by moving to a single-review system, but we would lose the immediate check on our results and risk errors creeping into our determinations more easily.

Copyright Review Verification

This verification process contemplates a future where reviewers at HathiTrust partners independently perform large-scale copyright review of volumes in the HathiTrust corpus. For example, the University of Wisconsin may wish to contribute copyright determinations for ten thousand works published in Ireland prior to 1945. In order for those reviews to be ingested into HathiTrust, they must be acceptable to HathiTrust’s legal counsel (currently the Office of General Counsel at the University of Michigan). A verification process can give counsel a degree of confidence in the reliability of a project’s results.

There are two stages to the verification process, outlined below.

Preproject Verification

Preproject verification would include a review of all project documentation for the proposed project, feedback on project design if necessary, and a recommendation to approve or deny the project based on the legal assumptions and project planning documents submitted for review. Legal expertise is essential at this stage, but a focus on process is equally important.

The verification process should focus on any flawed legal assumptions in the project, problematic project design choices, or any other errors that could undermine project results. If errors are identified in the preproject stage, applicants should be given time to address them and submit revised project documentation.

Stage 1: Process Verification

The following questions are relevant to the design of the review project and can serve as a foundation for your inquiry.

Legal

  1. What has the project team identified as relevant copyright durations for the following types of works?
    1. Known author
    2. Known (multiple) authors
    3. Unknown/anonymous author(s)
    4. Works published posthumously
    5. Corporate authors
    6. Government works
    7. Unpublished works
  2. Does the project account for the presence of third-party materials in volumes being reviewed? Document the reason or justification for accounting for—or choosing not to account for—third-party materials. This will affect decision making and the review process.
  3. Are copyright duration calculations appropriately cited and verified?
  4. What legal resources were used in developing the project plan and decision trees? Do the project’s legal analysis and workflow correspond appropriately with the legal resources cited?
  5. Is the review interface code a reliable translation of the project’s legal analysis? (Note that ideally a second programmer would be available to confirm the accuracy of the code; a minimal sketch of what such a check might look like follows this list.)
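
The following minimal sketch, offered only as an illustration of questions 3 and 5, shows one way a single duration rule might be translated into interface code and then spot-checked by a second programmer. The work types, the seventy-year post-mortem term, and the function name are placeholder assumptions; a real project would substitute the cited, verified durations from its own documentation.

```python
from datetime import date
from typing import Optional

# Placeholder duration table: work type -> term in years after the author's death.
# A real project would substitute cited, verified terms for its jurisdiction.
POST_MORTEM_TERMS = {
    "known_author": 70,
    "anonymous_author": 70,  # often measured from publication instead; placeholder only
}

def is_public_domain(work_type: str, author_death_year: int,
                     as_of: Optional[date] = None) -> bool:
    """Apply a post-mortem term, with protection running to the end of the
    calendar year in which the term expires (a common, but not universal, rule)."""
    as_of = as_of or date.today()
    return as_of.year > author_death_year + POST_MORTEM_TERMS[work_type]

# Checks a second programmer might run to confirm the translation of the rule:
assert is_public_domain("known_author", 1940, as_of=date(2024, 1, 1))
assert not is_public_domain("known_author", 1980, as_of=date(2024, 1, 1))
```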

Procedural

  1. Foreign language expertise may be necessary to collect facts relevant to a copyright determination. Do reviewers for the project have adequate language expertise to perform the reviews? Is any other expertise required by this project?
  2. Is a double review part of the project plan? If not, what is the justification for a single review?
  3. Has the project team developed a decision tree to guide copyright determinations? Is it practical? Is it legally accurate? (One way such a tree might be represented is sketched after this list.)
  4. What changes, if any, are recommended before this project moves forward?
  5. Does the team recommend that the project commence reviews, based on the planning documents submitted?
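
One hypothetical way to make a decision tree both reviewable by counsel and usable by the review interface, relevant to question 3 above, is to record it as a small data structure that the software can walk. The questions and outcomes below are placeholders for illustration, not a model legal analysis.

```python
from dataclasses import dataclass
from typing import Dict, Union

@dataclass
class Node:
    """One yes/no question in the decision tree."""
    question: str
    if_yes: Union["Node", str]  # either a follow-up question or a determination
    if_no: Union["Node", str]

# Placeholder tree; each project substitutes its own legally vetted questions.
TREE = Node(
    question="Was the work published before the project's cutoff date?",
    if_yes=Node(
        question="Is the author's death date known and outside the copyright term?",
        if_yes="public domain",
        if_no="undetermined",
    ),
    if_no="in copyright",
)

def walk(node: Union[Node, str], answers: Dict[str, bool]) -> str:
    """Follow the reviewer's yes/no answers (question -> bool) to a determination."""
    while isinstance(node, Node):
        node = node.if_yes if answers[node.question] else node.if_no
    return node
```

Keeping the tree as data rather than burying it in interface logic can make it easier for a verifier to compare the implemented questions against the approved project documentation.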

Stage 2: Results Verification

The second stage should be a third-party verification of randomly sampled results, drawn from the project’s volumes reviewed to date. This independent review should be designed to verify that the copyright determinations produced by the project are accurate and consistent with the previously approved project documentation. This review should be performed at an early stage in the project so that any errors can be identified and the review process can be modified when necessary.
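
A verification sample might be drawn along the lines of the sketch below. The sample size, file name, and field names are assumptions made for illustration; the actual parameters would be set in the approved project documentation.

```python
import csv
import random
from typing import Dict, List

SAMPLE_SIZE = 200  # hypothetical; the real size would be agreed with counsel

def draw_verification_sample(determinations_csv: str, seed: int = 2024) -> List[Dict[str, str]]:
    """Randomly sample completed determinations for independent re-review.

    Recording the seed makes the sample reproducible if the verification
    needs to be revisited later.
    """
    with open(determinations_csv, newline="") as f:
        completed = [row for row in csv.DictReader(f) if row["status"] == "completed"]
    random.seed(seed)
    return random.sample(completed, min(SAMPLE_SIZE, len(completed)))
```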

All identified errors must be corrected, as well as any consistent patterns of error that are discovered through the verification. For example, if a particular author was misidentified, all volumes tied to that author should be re-reviewed. If narrow, easily fixed errors account for the error rate, no new check will be required after these errors have been corrected.

If the errors represent patterns that might have a broad impact on the rest of the candidate pool, the project will need to conduct a re-review of some percentage of the candidate pool. The re-review should focus on the source of the errors, whether due to human error, flawed legal assumptions, application code, or problems related to the review process. The re-review should be performed as narrowly as is reasonable, given the error, and at its conclusion, a new random verification sample should be generated.
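
As a final illustration of keeping the re-review as narrow as the error allows, the earlier example of a misidentified author might be scoped as in the sketch below; the record fields are assumptions for the example.

```python
from typing import Dict, List, Set

def volumes_needing_rereview(all_reviews: List[Dict[str, str]],
                             flagged_authors: Set[str]) -> List[Dict[str, str]]:
    """Select every volume tied to an author whose identification was found to be
    in error, so the re-review covers the affected pool and nothing more."""
    return [record for record in all_reviews if record["author"] in flagged_authors]
```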