To Reduce Risk and Increase Efficiency in Investigations and Litigation, Data is Key
July 10, 2023
Handling large volumes of data during an investigation or litigation can be anxiety-inducing for legal teams. Corporate datasets can become a minefield of sensitive, privileged, and proprietary information that legal teams must identify as quickly as possible in order to mitigate risk. Ironically, corporate data also provides a key to speeding up and improving this process. By reusing metadata and work product from past matters in combination with advanced analytics, organizations can significantly reduce risk and increase efficiency during the review process.
In a recent episode of Law & Candor, I discussed the complex nature of corporate data and ways in which the work done on past matters—coupled with analytics and advanced review tools—can be reused and leveraged to reduce risk and increase efficiency for current and future matters. Here are my key takeaways from the conversation.
From burden to asset: leveraging data and analytics to gain the advantage
The evolution of analytical tools and technologies continues to change the data landscape for litigation and investigations. In complex matters especially—think multi-district litigation, second requests, large multi-year projects with multiple review streams—the technology and analytics that can now be applied to find responsive data not only helps streamline the review process but can extend corporate knowledge beyond a single matter for a larger purpose. Companies can now use their data to their advantage, transforming it from a liability into an asset.
Prior to standardization around email threading and technology-assisted review (TAR) and continuous active learning (CAL) workflows, repository models were the norm. Re-use of issue coding was the best way to gain efficiency, but each matter still began with a clean slate. Now, with more sophisticated analytics, it's not just coding and work product that can be re-used. The full analysis that went into making coding decisions can be applied to other matters, so that the knowledge gained from a review, and from the data itself, is not lost as new matters come along. The result is greater overall efficiency, not to mention major cost savings, over time.
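As a loose illustration of the idea, the sketch below reuses responsiveness coding from a prior matter to prioritize documents in a new one. All documents, terms, and scores are invented, and the term-overlap scoring stands in for the far more sophisticated models real analytics platforms use.

```python
# Minimal sketch: reuse prior coding decisions to rank a new matter's documents.
# Everything here is hypothetical illustration, not a production workflow.
from collections import Counter

# Prior matter: documents with their final responsiveness coding.
prior_coding = [
    ("quarterly earnings forecast attached", True),
    ("lunch order for friday", False),
    ("draft merger agreement for review", True),
    ("parking garage closed next week", False),
]

# Build a vocabulary of terms seen in documents coded responsive.
responsive_terms = Counter()
for text, responsive in prior_coding:
    if responsive:
        responsive_terms.update(text.split())

def score(doc: str) -> int:
    """Score a new document by its overlap with prior responsive vocabulary."""
    return sum(responsive_terms[t] for t in doc.split())

# New matter: surface the likely-responsive documents first.
new_docs = ["merger review meeting notes", "friday parking update"]
ranked = sorted(new_docs, key=score, reverse=True)
print(ranked[0])  # the merger document outranks the parking notice
```

The point of the sketch is the workflow shape, not the scoring: knowledge captured as coding decisions on one matter becomes a head start on the next, rather than starting from a clean slate.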
Enhanced tools and analytics reduce the risk of PII, privilege, and other sensitive data exposure
With today’s data volumes, the more traditional review methods, such as search terms and regular expressions (regex), often yield high recall with low precision. That is, the net is cast so wide that much of what is captured isn’t significant, while data that does matter can still be missed.
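To make the recall/precision trade-off concrete, here is a small worked example with invented numbers: a review population where a broad search-term query retrieves most of the responsive documents, but buries them in noise.

```python
# Hypothetical figures for illustration only.
# A population of 10,000 documents contains 500 responsive ones; a broad
# search-term query returns 4,000 hits, of which 450 are responsive.
hits = 4000
responsive_total = 500
responsive_in_hits = 450

precision = responsive_in_hits / hits           # fraction of hits that are responsive
recall = responsive_in_hits / responsive_total  # fraction of responsive docs retrieved

print(f"precision = {precision:.2%}")  # low: most of the 4,000 hits are noise
print(f"recall    = {recall:.2%}")     # high, yet 50 responsive docs are still missed
```

In these invented numbers, reviewers would read roughly nine non-responsive documents for every responsive one, which is exactly the inefficiency that analytical modeling aims to eliminate.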
Analytical modeling can help avoid that pitfall by leveraging prior work product and coding to reduce the size of the data population from the outset, sometimes by as much as 90%, and to help find information that more traditional tools often miss.
This is especially impactful when it comes to PII, PHI, and privileged or other sensitive data that may be in the population, because the risk of exposure is significantly reduced as accuracy increases.
Upfront costs may seem like a barrier, but downstream cost savings in review make up for it
When technology and data analytics are used to reduce data volume from the beginning, efficiencies are gained throughout the entire review process, and those gains compound in terms of both speed and cost. Unfortunately, the upfront costs may seem steep to the uninitiated, which is likely the main barrier to wider adoption of many advanced technologies. The initial outlay before a project even begins can be perceived as a challenge for eDiscovery cost centers.
Also, it can be very difficult for any company to keep up with the rapid evolution of both the complex data landscape and the analytics tools available to address it—the options can seem overwhelming. Finding the right technology partner with both expertise and experience in the appropriate analytics tools and workflows is crucial for making the transition to a more effective approach.
A good partner should be able to understand the needs of your company and provide the statistics necessary to support and justify a change. A proof-of-concept exercise can provide compelling evidence that a revised workflow will more than justify any up-front expenditure by dramatically reducing the cost of linear document review.
How to get started
Seeing is believing, as they say, and the best way to demonstrate that something works is to see it in action. A proof-of-concept exercise with a real use case—run side-by-side with the existing process—is an effective way to highlight the efficiencies gained by applying the appropriate analytics tools in the right places. A good consulting partner, especially one familiar with the company’s data landscape, should be able to design such a test to show that the downstream cost savings will justify the up-front spend, not just for a single matter, but for other matters as well.
Cross-matter analysis and analytics: the new frontier
TAR and CAL workflows, which are finally finding wider use, should be the first line of exploration for companies not yet well-versed in how these workflows can optimize efficiency. But that is just the beginning. Advanced analytics tools add an additional level of robustness that can put those workflows into overdrive. Cross-matter analysis and analytics, for example, can address important questions: How can companies use the knowledge and work product gleaned from prior matters and apply them to current and future matters? How can such knowledge be pooled and leveraged, in conjunction with AI or other machine learning tools, to create models that will be applicable to future efforts?
Marrying the old-school data repository concept with new analytics tools is opening a world of possibilities that we’re just beginning to explore. It’s a new frontier, and the most intrepid explorers will be the ones who reap the greatest benefits.
For more information on data reuse and other review strategies, check out our review solutions page.