Blog

Taking the Time to Get it Right… DiscoverReady Team Holds Summit on Quality in Document Review

Year-end is always a busy time. In addition to keeping up with day-to-day client demands, it’s time for financial and operational planning for the coming year.  In our case, mix in a heavy dose of integration work — combining the best of DiscoverReady and our new partners at ACT Litigation — and you’ve got yourself a barn burner.

Amid this frenzy of activity, some of our busiest senior-level team members from both organizations recently took time from their schedules to step back and strategically address one of our industry’s most critical issues: quality in document review.

As we’ve explored time and again, assessing quality or “accuracy” in the context of document review is a complex topic. If a document is marked responsive, one partner on the case agrees with the decision, and another partner disagrees, which one is right? In those instances, is the right answer that there’s no right answer? Separately, if a document is marked responsive and three issue tags are applied, is the document decision wrong if a fourth tag is missing?

These are not trick questions. And the obvious answers may not be the right answers, especially as our industry is trying to sort through the “accuracy” of human review versus the accuracy of automated review (aka predictive coding).
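One common way to put a number on this kind of reviewer disagreement is an inter-reviewer agreement statistic such as Cohen’s kappa, which corrects raw agreement for the agreement you would expect by chance. The sketch below is purely illustrative, not any DiscoverReady or ACT workflow; the function and the sample responsiveness calls are hypothetical.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two reviewers' responsive/not-responsive calls.

    Illustrative only: a chance-corrected agreement score where 1.0 means
    perfect agreement and 0.0 means no better than random coding.
    """
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)

    # Observed agreement: fraction of documents coded the same way.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n

    # Expected chance agreement, from each reviewer's label frequencies.
    freq_a = Counter(labels_a)
    freq_b = Counter(labels_b)
    expected = sum(
        (freq_a[label] / n) * (freq_b[label] / n)
        for label in set(labels_a) | set(labels_b)
    )

    return (observed - expected) / (1 - expected)

# Hypothetical responsiveness calls from two reviewers on ten documents.
reviewer_1 = ["R", "R", "NR", "R", "NR", "NR", "R", "R", "NR", "R"]
reviewer_2 = ["R", "NR", "NR", "R", "NR", "R", "R", "R", "NR", "R"]

print(f"kappa = {cohens_kappa(reviewer_1, reviewer_2):.2f}")
```

In this toy example the reviewers agree on 8 of 10 documents, but the chance-corrected kappa is only about 0.58, which is exactly the gap between “they mostly agree” and “we can defend this coding” that makes the accuracy question so hard.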

I’m very proud that over the years our combined organizations have been at the forefront of addressing both “what constitutes quality” and “how to generate quality” in document review. And as much as DiscoverReady and ACT have done separately on this subject, I’m even more excited about the work to come from our newly unified team.

Quality in Document Review

In the coming weeks and months, Macyl Burke, our Valencia-based guru on quality in e-discovery, and David Shub, our senior-most legal contributor on this topic, will be working hard to provide a practical guide to defining and achieving quality in document review, whether human or automated.

The first charge for these gentlemen, and a multitude of colleagues working behind the scenes, is to lay the foundation for a common understanding of “defining and assessing quality in document review” that can be accepted throughout our industry.

From an internal perspective, this team is also responsible for creating a roadmap of best practices for our combined organization to follow going forward. These practices will incorporate the best of planning, process, automation, and sampling to ensure that we continue delivering the highest levels of quality and defensibility in document review.
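To make the sampling piece of that roadmap a little more concrete, here is one small, hypothetical illustration of what sample-based QC can look like: draw a simple random sample from a completed review batch, re-review it, and estimate the batch’s error rate with a confidence interval. The batch size, sample size, and error count below are made up for the example, not our actual protocol.

```python
import math
import random

def qc_sample(reviewed_doc_ids, sample_size, seed=42):
    """Draw a simple random sample of reviewed documents for second-pass QC."""
    rng = random.Random(seed)
    return rng.sample(reviewed_doc_ids, sample_size)

def error_rate_interval(errors_found, sample_size, z=1.96):
    """Observed error rate with a 95% normal-approximation confidence interval."""
    p = errors_found / sample_size
    margin = z * math.sqrt(p * (1 - p) / sample_size)
    return p, max(0.0, p - margin), min(1.0, p + margin)

# Hypothetical batch of 20,000 reviewed documents; QC re-reviews 400 of them.
batch = [f"DOC-{i:06d}" for i in range(20_000)]
sample = qc_sample(batch, sample_size=400)

# Suppose the QC pass overturns 12 coding decisions in the sample.
rate, low, high = error_rate_interval(errors_found=12, sample_size=400)
print(f"estimated error rate {rate:.1%} (95% CI {low:.1%} to {high:.1%})")
```

The point isn’t the particular statistics; it’s that a documented, repeatable sampling step turns “we think the review is accurate” into a measured, defensible claim.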

Stay tuned for more.

 

Maureen O'Neill