Nearly a year ago, we were tasked with the challenge of finding a new review platform for Vorys, Sater, Seymour and Pease LLP. I have described below the path we took in our evaluation and selection process. Our overall process was as follows:
1. Define where you are
2. Define where you want to be
3. Identify requirements
4. Identify potential solutions and conduct high level overview
5. Requests for information
6. Detailed reviews
7. Proof of concept
8. Select product
There are many ways to skin this cat depending on your needs. My intent here is to provide you with insight into the process we followed, the products we evaluated and the product we chose as our solution. It is not my intent to suggest that our process or solution is right for you. I encourage others who have recently been down this road to comment or contribute.
First Things First
Before I tell you about our process for evaluating and selecting a tool, it will help to provide you with some background information. Vorys has 375 attorneys in 6 offices and our main office is located in Columbus, Ohio. Our litigation support team consists of me and a team of 6 highly skilled electronic discovery professionals (having a knowledgeable team who could devote time to this project was key to our selection process).
The last time we chose a review platform was 2001, when we selected Summation iBlaze. We utilized it until approximately 2006, when we migrated to Summation Enterprise. We installed and operated 3 Summation Enterprise installations in our largest offices. We had 160 concurrent Summation licenses, approximately 600 active databases and over 15TB of data. Additionally, we had 5 bundles of LAW 5.0 for e-discovery processing, Nuix Proof Finder, Clearwell and CaseMap.
Our litigation technology department handles the processing and hosting of most cases under 50GB but has processed and hosted cases over 100GB, where it makes sense. Most cases over 50GB are evaluated for outsourcing.
In addition to the technology, Vorys has invested a huge amount of resources over the years developing specifications and workflows for all of our processes. Having detailed workflows and specifications is key to managing electronic discovery in-house. This is an important consideration when changing platforms, since these will likely require an overhaul.
We assembled a project team and held our kickoff meeting in March, 2012. The project team consisted of our litigation technology team, our PMO, our Senior Manager of Service Delivery in IT and our CIO.
How We Did It
Our number one objective was to find the product that best fit our requirements. We started by creating a spreadsheet with all of the functions and features we had with our current systems. (Fortunately, we had checklists we had used to test upgrades in the past, which was a great start!) Additional state-of-the-art features were also added to this list. Once the list was complete, we determined which of these functions and features were “must haves” vs. “nice to haves”. The spreadsheet followed our workflows and contained functionality and features for System Administration, Case Set-up, Pre-Processing Analysis, Processing/Analytics, Loading/Importing, Review, Production, Reporting/Metrics and Transcripts. The list contained over 225 functions and features. The final draft was reviewed and discussed with our users to ensure we covered all of their concerns, including which functions and features were “must haves” from their perspective.
Next we compiled a list of products on the market (the list was restricted to products that were currently in use with active cases at a referenceable site) and set out to determine which ones were potential solutions for us. Because some of the products were all-in-ones, we were prompted to re-evaluate our processing capabilities as well. The next step was to schedule 1 hour product demonstrations with everyone on our list. We included the following products in our initial review: Autonomy E-discovery*, Case Logistix, Lexis Products (Concordance Classic, Evolution, LAW EDA and TextMap)*, Digital Reef, Digital War Room*, Equivio, iConnect Xera, IPRO suite*, Kroll Verve*, Lateral Data’s Viewpoint*, Liquid Lit Manager*, NeedleFinder, Nextpoint, Nuix, Orcatech, Recommind*, Relativity*, Stored IQ, TunnelVision (Mindseye), Venio and Zylab*. Our evaluation led to the selection of 10 products for the next phase of our project (see * above).
We prepared a Request for Information (RFI) and sent it to these 10 in June. The RFI started with a summary of Vorys’ current set-up as described above, our objective for this project, and a forecast of data volumes and user counts over the next 3 years. This summary was followed by several pages of written questions covering things such as vendor background, software specifications and infrastructure requirements, proof of concept capabilities, licensing and pricing, training and support, installation, upgrades, product roadmaps and references. Additionally, we attached the feature/functionality list mentioned above, and asked the vendors to indicate which features/functions were in the current version of their product. We had calls with most, if not all, of the vendors to answer any questions they had. We also contacted references for each product and completed reference score cards. (By the way, thanks so much to all of you who served as references; it is amazing to see how willing our community is to share their experiences and assist their peers!) When the RFIs came in, we reviewed the responses and compiled all of the data into a scorecard. Based on the RFI and reference scorecard results, we identified our top 6 products in August, 2012. Our top 6 included: Autonomy E-discovery, IPRO Suite, kCura’s Relativity, Kroll Verve, Lateral Data’s Viewpoint and Recommind Axcelerate.
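For readers building their own scorecard, the roll-up from feature checklist to vendor score can be sketched in a few lines of code. This is purely an illustration of the idea: the feature names, weights, scoring scale and the rule that a missing “must have” disqualifies a vendor are my hypothetical assumptions for the example, not our actual criteria or results.

```python
# Hypothetical feature checklist: each entry is a "must have" or "nice to
# have" with an assumed weight. Real checklists (like ours) run 200+ items.
features = {
    "email_threading":   {"priority": "must", "weight": 3},
    "near_dupe":         {"priority": "must", "weight": 3},
    "native_production": {"priority": "nice", "weight": 1},
}

# Hypothetical RFI answers: True means the feature is in the current version.
responses = {
    "Vendor A": {"email_threading": True, "near_dupe": True,  "native_production": False},
    "Vendor B": {"email_threading": True, "near_dupe": False, "native_production": True},
}

def score_vendor(answers):
    """Return a weighted score, or None if any must-have is unsupported."""
    if any(not answers[f] for f, spec in features.items()
           if spec["priority"] == "must"):
        return None  # fails a must-have, so it drops out of consideration
    return sum(spec["weight"] for f, spec in features.items() if answers[f])

for vendor, answers in responses.items():
    result = score_vendor(answers)
    print(vendor, "disqualified" if result is None else result)
```

In this sketch, Vendor A covers both must-haves and scores on them, while Vendor B misses a must-have and drops out regardless of its nice-to-have coverage.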
This was truly an outstanding group of products and I must say that the competition was stiff. It was really important for us to do all of the homework up front in identifying what our firm needed in order to make it to the next level.
Taking a Deep Dive
Our next phase involved detailed demonstrations (which we called deep dives) of the 6 products. Once again, we created scorecards for this phase. We had 2 primary areas to cover in the deep dives.
The first area was evaluating how each of these 6 systems could support some of our old image-based repositories. To evaluate this, we sent the vendors test data similar to the coding we had in these repositories. We asked the vendors to load the coding into their system and show us the results. We also asked them what issues, if any, they encountered in loading the data. As a result of this test, we were able to visualize how these systems would handle these repositories and discovered several issues we would have to address if we migrated.
The second area was the demonstration of the more than 200 functions and features we identified at the outset. The purpose of this phase was to confirm not only what each system could do, but how it did it. We asked the vendors in advance to go down the list and demonstrate each item in order. These demonstrations ran 4 hours per vendor, conducted over a 2 ½ week period. The entire litigation technology team was required to attend all 6 sessions and complete a scorecard. Once these demonstrations were complete, litigation technology compiled, tallied and discussed the results. We were now ready to identify the top 2 or 3 systems for a POC.
Proof of Concept
Our top 2 contenders for a POC were kCura’s Relativity and Lateral Data’s Viewpoint (a subsidiary of Conduent). We scheduled part of October, 2012 and all of November, 2012 to complete a POC. The litigation technology team was divided into two groups and assigned specific areas to test and score during the POC. Team 1 focused on case set-up, processing, loading, early data assessment and culling in both products. Team 2 focused on analytics, review, productions and reporting in both products.
We also set up demonstrations for our legal teams. We offered back-to-back ½ hour demonstrations of each product for those professionals wanting a high level overview. We also offered 2-hour back-to-back demonstrations of both products for the professionals who frequently worked in our current review platform. Both sessions were also recorded so the legal teams could view them at their convenience. We solicited feedback from our users via email by simply asking which product they preferred and why.
The information from the POC and from our users was compiled and we made a decision.
Drum Roll Please…
We selected Lateral Data’s Viewpoint. Some of the top reasons we chose Viewpoint included:
• The all-in-one concept that Viewpoint offers, bringing great efficiency to our workflow.
• The speed and simplicity of processing.
• The culling and analytics capabilities.
• Viewpoint Assisted Review is included in the product.
• Email threading and near-dupe analysis are automatically included.
• Productions (including native, TIFF or hybrid) are intuitive and straightforward.
• The overall user interface.
It has been a long and sometimes painful journey, but we aren’t done yet! We are currently determining whether we will host Viewpoint with Conduent or behind our firewall. Conduent has set up a test system for us at their facility, and our Lit Tech team is busy developing workflows. We hope to be ready to roll when the installation is complete. We will begin with a pilot case and, with success, identify the go-live date for Viewpoint. Additionally, we will create and execute a migration plan for our current cases. Finally, we are evaluating and will implement a solution for transcripts. I’m happy to share the details of the next leg of our journey if you think it will be helpful. Best of luck to any of you who are at the beginning or in the middle of this process!
About the Author
Julie Brown is the Litigation Technology Executive Manager at Vorys, Sater, Seymour and Pease LLP.