Lessons in Technology-Assisted Review Transparency

March 11, 2013 Stuart LaRosa

How should you negotiate with your adversary when considering the use of technology-assisted review (TAR)? What disclosure obligations should you accept? And should a party agree to an adversary's demands, or move on to another review method? These are all questions counsel should be prepared to address before utilizing TAR.

There’s no doubt that basic transparency about the process is a good thing. In Da Silva Moore v. Publicis Groupe, U.S. Magistrate Judge Andrew J. Peck noted that “transparency allows the opposing side to be more comfortable with computer-assisted review and reduces fears about the so-called black box of the technology. This court highly recommends that counsel in future cases be willing to at least discuss, if not agree to, such transparency in the computer-assisted review process.” To that end, Judge Peck recommended that counsel collaborate in designing “an appropriate process, including use of available technology, with appropriate quality control testing, to review and process relevant ESI while adhering to Rule 1 and Rule 26(b)(2)(C) proportionality.”

The Case Management Order: Protocol Relating to the Production of Electronically Stored Information (“ESI”) entered in In re Actos (Pioglitazone) Products Liability Litigation serves as a detailed example of the transparent, cooperative approach Judge Peck envisioned. In In re Actos, U.S. District Judge Rebecca Doherty approved the parties’ TAR protocol, which reflected the parties’ agreement in a number of areas: the sources of ESI, the key custodians, the TAR platform, the format for production, and the methodology of the review. Each party also agreed to appoint three “experts to work collaboratively to train the TAR software” by reviewing a sample population of documents that would be used to train the technology engine. The defendant’s experts were allowed to preview the sample set to remove privileged documents; the plaintiff’s experts signed a nondisclosure agreement to protect against any inadvertent disclosure. In addition, both sides reviewed a sample of the documents that the TAR engine found irrelevant to check its accuracy. The parties also agreed to set a relevance threshold, above which the defendants would review all documents prior to production.

Does this mean that counsel hoping to use TAR should confer early with opposing counsel to identify any concerns and try to allay them by collaborating to create a cooperative, transparent process? If counsel opt to head down this path, there are a number of options to consider:

Confer upfront?

First, some parties may choose to work together in developing the seed set of documents used to train the TAR algorithm, as in In re Actos. In these cases, parties would share the coding of each iteration of sample sets used to train the system to avoid later concerns that the process was inaccurate or inconsistent or, as Judge Peck noted in Da Silva Moore, to avoid claims that the computer was not trained properly.

Another option is to collaborate with opposing counsel on the back end of the TAR process, once results are generated. In this scenario, parties would collaboratively review a randomly drawn sample of documents in the population, typically documents with scores below the relevance threshold that would not be included in the review set. Parties also could find it worthwhile to review a sample of the documents that were withheld as irrelevant although they also contained certain relevant keywords to confirm the accuracy of the search terms.
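To make the back-end check concrete, the sketch below shows one way such a quality-control sample might be drawn. The document IDs, relevance scale, cutoff, and sample size are all hypothetical; actual parameters would come from the parties' agreed protocol.

```python
import random

def sample_below_threshold(scored_docs, threshold, sample_size, seed=42):
    """Draw a simple random sample of documents whose TAR relevance
    scores fall below the agreed cutoff, for quality-control review."""
    below = [doc_id for doc_id, score in scored_docs.items() if score < threshold]
    rng = random.Random(seed)  # fixed seed so both sides can reproduce the draw
    return rng.sample(below, min(sample_size, len(below)))

# Hypothetical scores on a 0-100 relevance scale, with a cutoff of 65:
scores = {f"DOC{i:04d}": (i * 37) % 100 for i in range(1000)}
qc_sample = sample_below_threshold(scores, threshold=65, sample_size=50)
print(len(qc_sample))  # 50 below-threshold documents pulled for human review
```

Fixing the random seed is a deliberate choice here: it lets both parties regenerate the identical sample and confirm no documents were cherry-picked.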

Does Actos set a precedent for use of experts?

No, but there’s no doubt that experts can play an important role in solidifying the defensibility of the producing party’s sampling and training processes, providing “A Peek Behind the Technology-Assisted Review Curtain.” For instance, early in the process, statisticians can guide parties in choosing appropriate sampling techniques and ensure that the sample seed set is drawn correctly and appropriately sized; later, they can affirm that the results are statistically valid. Similarly, linguists can help parties construct searches designed to capture responsive documents within the seed sets and verify that the parties did not overlook important related terms or variants.
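As a rough illustration of the sizing guidance a statistician might provide, the sketch below applies the standard sample-size formula for estimating a proportion. The confidence level and margin of error shown are common defaults, not values drawn from any particular protocol.

```python
import math

def required_sample_size(z=1.96, margin_of_error=0.05, p=0.5):
    """Classic sample-size formula for estimating a proportion:
    n = z^2 * p * (1 - p) / e^2, rounded up.
    z=1.96 corresponds to 95% confidence; p=0.5 is the
    conservative worst-case assumption about prevalence."""
    return math.ceil(z ** 2 * p * (1 - p) / margin_of_error ** 2)

print(required_sample_size())                       # 385 docs at 95% conf, +/-5%
print(required_sample_size(margin_of_error=0.02))   # 2401 docs at 95% conf, +/-2%
```

The takeaway for counsel: tightening the margin of error from 5% to 2% more than sextuples the review burden, which is exactly the kind of cost-accuracy trade-off an expert can help the parties negotiate.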

Regardless of which approach producing parties take, they should take steps to protect themselves against the inadvertent disclosure of privileged, confidential, or proprietary information. Producing parties may choose to enter a nondisclosure agreement or protective order to safeguard their confidential information. They might also propose a clawback agreement; if so, they should also ask the court to enter an order memorializing the agreement to maximize the protections that Federal Rule of Evidence 502 affords. Agreements and orders such as these are vitally important, especially when producing parties choose to skip manual review of any documents marked responsive.

To evaluate the benefits and potential risks of each approach to the cooperative use of TAR, counsel should consult with seasoned e-discovery experts.

Stuart LaRosa is a senior search consultant with Conduent. He can be reached at slarosa@conduent.com.
