After a decade and a half wandering on the left and the right of the EDRM, I have heard every assumption or myth under the sun with regard to managed review. Many of the beliefs about managed review are so ingrained that no one even bothers to question them — however, the industry has grown and evolved substantially over the last 15 years and some of the “facts” we take for granted are simply not true.
Let’s unpack the most prevalent myths of managed review and highlight the technology and processes reshaping our understanding of the human side of ediscovery.
Myth: Doc reviewers can only go through 55 docs per hour
I can pretty much guarantee you have heard the magic number 55: Managed review providers base their cost and timing estimates on a review rate of 55 documents per hour.
When linear review was the only option (scrolling through documents without any overarching prioritization or organization), 55 documents per hour was a fair assessment. Today, there are a number of tools and processes to accelerate that review rate, from threading to full predictive coding.
Using best-of-breed workflows and advanced technology, including continuous active learning (CAL), DISCO is able to bump the average review rate to 88 documents per hour, drastically reducing review time and cost.
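The impact of that rate difference compounds quickly at scale. Here is a minimal back-of-the-envelope sketch; the document count and hourly rate are hypothetical and not from the article:

```python
# Illustrative arithmetic only: total hours and cost to review a
# population at 55 docs/hour (linear) vs. 88 docs/hour (CAL-assisted).
# DOCS and RATE below are invented for the example.

def review_cost(total_docs, docs_per_hour, hourly_rate):
    """Return (hours, cost) to review total_docs at the given rate."""
    hours = total_docs / docs_per_hour
    return hours, hours * hourly_rate

DOCS = 100_000          # hypothetical review population
RATE = 50.0             # hypothetical blended reviewer rate, $/hour

linear_hours, linear_cost = review_cost(DOCS, 55, RATE)
cal_hours, cal_cost = review_cost(DOCS, 88, RATE)

print(f"linear: {linear_hours:,.0f} hours, ${linear_cost:,.0f}")
print(f"CAL:    {cal_hours:,.0f} hours, ${cal_cost:,.0f}")
print(f"savings: {1 - cal_hours / linear_hours:.0%}")
```

At any document count, moving from 55 to 88 docs/hour trims 37.5% off review hours, since the ratio 55/88 is fixed regardless of volume.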
Be sure you understand the tools, processes, and technology a provider is employing before agreeing to the 55-document default.
Myth: You need a large data set for TAR
The earliest iterations of technology-assisted review (TAR) required iterative rounds of training and large, heterogeneous data sets to produce the best results. Now, with tools leveraging deep learning and CAL, practitioners begin gaining insight after only the first few documents coded, and we have seen substantial benefits for data sets of only a few thousand documents.
CAL and analytics create massive efficiency in review speed and, more importantly, time to insight. Even for a small investigation or a preliminary peek at the data, CAL reduces time to evidence.
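The core idea behind a CAL-style workflow can be sketched in a few dozen lines. This is a toy simulation, not DISCO's actual implementation: each coding decision retrains a naive term-weight model, which re-ranks the unreviewed queue so likely-relevant documents surface first. All documents, terms, and the scoring scheme are invented for illustration:

```python
import math
from collections import Counter

def term_weights(coded):
    """Log-odds-style weight per term, learned from coded documents."""
    rel, nonrel = Counter(), Counter()
    for terms, is_relevant in coded:
        (rel if is_relevant else nonrel).update(terms)
    vocab = set(rel) | set(nonrel)
    return {t: math.log((rel[t] + 1) / (nonrel[t] + 1)) for t in vocab}

def score(terms, weights):
    """Sum of learned weights for the terms a document contains."""
    return sum(weights.get(t, 0.0) for t in terms)

def cal_review(docs, oracle, batch_size=2):
    """Review docs in model-prioritized batches.

    docs:   list of (doc_id, set_of_terms)
    oracle: callable standing in for the human reviewer; takes a
            doc_id and returns True if the document is relevant.
    Returns the order in which documents were reviewed.
    """
    queue, coded, order = list(docs), [], []
    while queue:
        # Retrain on everything coded so far, then pull the
        # highest-scoring batch to the front of the queue.
        weights = term_weights(coded)
        queue.sort(key=lambda d: score(d[1], weights), reverse=True)
        for doc_id, terms in queue[:batch_size]:
            coded.append((terms, oracle(doc_id)))
            order.append(doc_id)
        queue = queue[batch_size:]
    return order
```

With a handful of invented documents where the relevant ones share terms like "merger" and "pricing", the loop pushes the remaining relevant documents to the top of the queue immediately after the first coded batch, which is why time-to-evidence drops even on small data sets.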
Myth: You need a big review team to go through lots of data
When I began my career, one of the first large matters I worked on (Intel v. AMD) employed north of 500 attorneys for years. While this was an anomaly, it was not uncommon for a second request, large investigation, or multinational litigation to have teams this large.
Advanced technology (like CAL), more efficient culling, and precise data collections are all contributing to a reduction in the number of people necessary to conduct a timely and efficient review. Today, teams of just a few attorneys can parse even the most intimidating data volumes.
Be sure that your managed review partner is making the most of technological advances impacting review speed.
Myth: Contract attorneys are fungible
There used to be a belief among many people who staffed and managed large review teams that a contract attorney was a contract attorney was a contract attorney. And, for quite a while, the only true differences between reviewers were how easy they were to work with and their review speed and accuracy. But times, they are a-changin’.
With the sophistication of deep learning and CAL-based analytics and the increasingly complex and varied data sources in reviews today, a reviewer’s technological expertise and subject matter familiarity are important. A well-trained reviewer who understands the case, the data types, and the best practices for making the most of their platform yields substantially higher review rates and is better able to uncover relevant information. DISCO takes advantage of this by rolling top attorneys from matter to matter and training them on workflow and technology capabilities. The results are impressive.
Myth: Review PMs are babysitters
While review attorneys were seen as warm bodies, their project managers were not regarded much more favorably. In the early days, a review PM’s main job was to ensure decorum in the review room, collect timesheets, gather questions for the case team, and sit in the room to make sure folks were not surfing the net.
Today, with the influx of advanced business intelligence and metrics, new technology-optimized workflows, and substantive expertise, a PM serves as a major catalyst for efficiency and reducing time to insight.
Ensure that you understand what you are getting when you pay for PM time. Some companies will offer PMs at lower rates, but these lower-cost resources will not offer the same review-optimizing expertise and workflow management. Qualified PMs will work with the case team to apply appropriate, defensible, and optimized review workflows. They will continually apply quality control checks and train the review team to make the best use of the technology.
Myth: A good PM just checks attendance and does QC
Many people undervalued the contribution of project management to the efficiency of a review because of a belief that PMs were glorified admins and QC checkers.
The reality is that today’s PMs offer a lot more value, driving insights from applying advanced technology optimized workflows, providing reporting and insights on key metrics, and serving as a key part of the case team.
Myth: Supervision isn’t that big a deal
There is a persistent belief that the work performed by document reviewers is also fungible. Far too many law firms simply wait for the review to be complete, or substantially complete, before looking at the coding decisions made by the review team.
The most successful reviews ensure there is plenty of communication between the knowledgeable associate teams and the document review teams. The least successful reviews try to replace that supervision of work with search terms. The Blair and Maron study was published 34 years ago, yet some firms still try to use search terms instead of working directly with their document review teams.
Courts are becoming increasingly particular about the sufficiency of privilege search terms in document review and the definition of “inadvertent waiver.” It seems clear that courts want law firms to be involved with their document reviewers and not simply waiting around for the final product.
Myth: Reviewers perform the same regardless of platform
Review platforms are often treated as completely interchangeable, with folks assuming that the tool used will not have a material impact on the review accuracy, speed to insight, or review rate. The reality is that tools vary greatly in functionality, computational power, and speed of review. For example, DISCO Review displays document and search results in less than a second. Additionally, the analytics, application of AI, and advanced workflows offered greatly impact the reviewer speed and accuracy.
It is important to understand how the platform you are engaged with works, what efficiencies it offers, and what latency you can expect in page loading or search speed. These factors bear directly on review cost and speed.
Myth: QC is a scam to increase billable hours
When reviewing a managed review bid, it is not uncommon to see QC rates of 10-20%. Some people try to negotiate this volume of time down, seeing it as redundant. However, unless you want to deal with a clawback or the inadvertent production of privileged documents, the slightly increased cost of an appropriate volume of QC is absolutely worth it.
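The tradeoff is easy to put in concrete terms. A minimal sketch, with an entirely hypothetical first-pass spend (the 10% and 20% fractions come from the bids discussed above; every dollar figure is invented):

```python
# Back-of-the-envelope view of QC as insurance: the added spend is a
# fixed fraction of first-pass review cost. FIRST_PASS is hypothetical.

def qc_cost(first_pass_cost, qc_fraction):
    """Added QC spend as a share of first-pass review cost."""
    return first_pass_cost * qc_fraction

FIRST_PASS = 200_000.0   # hypothetical first-pass review spend, $

for fraction in (0.10, 0.20):
    print(f"{fraction:.0%} QC adds ${qc_cost(FIRST_PASS, fraction):,.0f}")
# Weigh those figures against motion practice over a clawback or the
# fallout from an inadvertent privilege production.
```

The point is that QC scales linearly with review spend, while the downside it protects against (a waiver fight, re-review, sanctions exposure) does not.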
Myth: Brute force is the best option
Some old-school practitioners are hesitant to rely on technology-optimized reviews, believing that regardless of the data volume, eyes on every document is still the best approach.
This approach is no longer feasible given the rapidly expanding data volumes, even in small matters. The truth is that a solution that marries technology and well-trained personnel will yield substantially more cost-effective, timely, and accurate results.
Myth: When in doubt, throw more bodies and money at the problem
When faced with a massive data volume, a short turnaround, or a messy data set, many practitioners have turned to spending more money to throw more bodies at the review. The reality today is that many cases are simply too big for this approach. In my previous life, there were cases where the budget was limitless but the constraints of time and the need to quickly understand the data set rendered this approach impossible. As one former jurist told me, the need for technical competence is growing, and we will someday reach the point where choosing to be a Luddite, or embracing only linear review, is viewed as ethically questionable.
Managed review remains the largest cost in the growing ediscovery market, and the long-held beliefs that it can only be done a certain way, with the same old levels of accuracy and speed, are no longer true. There are better ways to approach managed review, and much better results can be achieved.