Enterprises are creating and storing more unstructured content than ever before, so when a litigation, compliance, or investigation event hits, it falls on the legal team to review and produce those documents. Unfortunately, reviewers feel the collateral damage of undisciplined information governance practices, and frequently have to look at thousands of random documents, often the same documents repeatedly, in eDiscovery review projects that last hundreds of hours. Coding a duplicate document multiple times across a huge review universe wastes precious review time and bores the human behind the screen.
Coding emails that are just snippets of threads or families is disengaging and obscures the big picture. The new Review in Context feature in OpenText™ Axcelerate™ aims to change that by enabling the review of a coherent story instead of a collection of disconnected documents. By leveraging document relationships like families, duplicates, near-duplicates, and email threads, Review in Context empowers users to better understand the case’s context and streamline legal review in one simple interface.
A better way to find related documents
The best eDiscovery software should minimize the effort required to organize related documents like threads, duplicates, and families. Axcelerate’s new Review in Context interface eliminates cumbersome tasks such as running email threading scripts or manually creating individual searches to gather duplicates or family members.
Review in Context’s intuitive visual interface does it all—with no additional steps, time, or effort involved. Users can see duplicates, attachments, and email threads alongside the documents they’re reviewing. This makes reviewers aware of associated document sets and helps them review more efficiently and tag more consistently. Documents that have family members or belong to an email thread display together in review groups separated by bold grey lines. Parent documents are followed by their attachments, which are indicated by a paperclip icon.
Consistent coding across document relationships
Two documents that are exactly (or nearly) the same should be treated consistently and coded the same way. If a document is considered irrelevant to a case, its duplicates should be tagged not relevant, too. Review in Context ensures consistent treatment across related documents such as duplicates, near-duplicates, families, and email threads.
A tiny difference between two documents can make a huge impact on a legal case, so having automated tools and technology to help spot those subtle differences is imperative. Near-duplicates are documents that are almost identical but slightly different. Review in Context clearly distinguishes the two: duplicates sorted together in the same batch set display a “D,” while near-duplicates display an “N.”
Reviewers can instantly tag an entire group of documents that require the same coding. The tagging panel offers the option to automatically tag all unreviewed duplicate documents in the same review set, and the same option applies to family and email thread members. Bulk tagging documents based on relationships saves time by sparing reviewers from manually reviewing and then individually tagging related documents. Shaving even a couple of minutes off every coding decision adds up to countless hours saved and translates to drastic cost savings over the lifespan of an eDiscovery project.
Quality assurance among reviewers
In addition to ensuring consistency, Review in Context can help case managers quality check and rapidly remediate decisions made by their review team for the utmost accuracy. Different reviewers may have varying opinions on what is relevant or privileged: one reviewer may tag a document relevant while a second considers its duplicate not relevant. Review in Context surfaces these conflicts to help case managers ensure standardization across a project and to help reviewers inform their decisions before tagging. Other times, a reviewer may simply be coding everything incorrectly and needs to be alerted or removed from the project. The Duplicates (D) and Near Duplicates (N) fly-ins show inconsistently tagged duplicates that need to be addressed by a case manager.
Fewer distractions for even greater results
Review in Context reduces reviewer distractions by showing only the metadata relevant to a document’s MIME type, along with intuitive visual icons. For instance, the fields displayed for an email are not the same as those visible for a Word file: fields like “Sender Name” and “Recipient Count” appear for emails but not for Word files. Visual icons further help reviewers stay focused by making their progress and duplicate relationships clear at a glance. An eye icon indicates that a document has been viewed but not tagged in review, whereas a check mark means the document has been reviewed and tagged.