MedTech Summit 2021 Recap: How DistillerSR has empowered market leaders Philips and Alcon to achieve faster and more accurate literature reviews for CER submissions.


At our session at MedTech Summit 2021, we were joined by Sara Garbin, Senior Clinical Development Scientist at Philips, and Alison Ramsey, Director/Head of Clinical Evaluations at Alcon, for an insightful discussion about how leveraging automation can streamline manual literature review processes and lead to faster and more accurate CER submissions.

Q: How do you use literature reviews as part of meeting EU-MDR requirements?

Alison: We use literature reviews at a number of steps throughout our medical device development cycle. The most frequent is to fulfill CER safety and performance requirements: looking at the outcomes for medical devices currently going through clinical trials and identifying previously unreported or unknown risks and benefits. We also use literature reviews to establish the state-of-the-art (SOTA) and to support a new indication for an existing medical device. Lastly, we conduct literature reviews to write the clinical background section of our CERs.

Sara: Literature reviews are very heavily woven into our clinical evaluation process. Literature reviews become incredibly important when you’re dealing with a high-risk medical device when it’s not feasible (or ethical) to test it in human subjects and there’s abundant research out there for an equivalent device. That means we use literature reviews to check for safety and performance related to EU-MDR requirements. We also use them to establish the state-of-the-art (SOTA) and ensure that our devices are in line with what should be used in the market today.

Q: How are you handling the current volume of clinical evidence?

Sara: The only way to survive and thrive in this information overload environment is to come up with more efficient workflows. You have to start streamlining things, and cutting down on manual processes.

Alison: We have approached this question from several different viewpoints. We hired more people; our team has grown from 5 to 25 reviewers. More importantly, though, we had to address our internal processes. That is what really made a significant impact on our bottom line.

Before employing automation for literature reviews, the process was time-consuming and laborious. First, we would retrieve the full-text articles from databases such as PubMed and print them out. Then, we would read through our stack of papers and manually highlight important portions, take notes in the margins of each sheet of paper, and use post-it tabs to mark important articles. After that, we would write up our exclusion codes list and separate all the references into piles according to that list. Once that was done, we would build Excel spreadsheets for the included and excluded articles. From there, we would go through our analysis and appraisal process, put it all into a report, and send that out for review. At this point, invariably, someone was going to say: “I think this article should not have been included,” or the other way around. That would force you back to the drawing board, and you’d find yourself moving things around in these spreadsheets, from one table to another. Once you started doing that, you significantly increased the risk of human error and the chance of inaccurate results by the end of the literature review.

Implementing DistillerSR has enabled us to go through the literature very quickly and very efficiently. We’ve gone paperless; we haven’t printed anything out since. It’s been a fantastic tool to streamline our literature review process. Whenever a large volume of references comes back for review, DistillerSR allows us to sort out the inclusion and exclusion criteria very easily and clearly.

Before DistillerSR, our team of reviewers would have been able to complete approximately 15 literature reviews over the course of 12 months. Since implementing DistillerSR and increasing headcount, we have increased the number of literature reviews we are able to complete to 100 per year. Leveraging automation and intelligent workflows throughout the review lifecycle translated into significant time and cost savings.

Sara: We went through a similar process. We were putting our appraisals into spreadsheets, then exporting references from the various databases into a Word document, going through them, and adding comments to that document with inclusion/exclusion reasons. The process became incredibly error-prone and time-consuming because if you accidentally move a reference and the comment doesn’t move with it, you are suddenly left with loose comments that are not linked to a particular reference, and the end result can be very messy. We use global templates throughout our process, such as the PRISMA diagram, and it was an awful exercise to produce these manually. You are just at the mercy of human nature: each person phrases things differently, and there is no easy way to search and filter consistently, especially if you have multiple reviewers working on a single project.

Going through the appraisal process manually, you don’t realize the mental burden of reviewing hundreds of references in a spreadsheet continuously. When you’re working in the DistillerSR interface, everything you need is in one screen, and the information gets compiled automatically, based on your answers for each reference. As a result, the whole process has become much easier and less stressful. We have saved quite a bit of time just by implementing DistillerSR and by revisiting some of the more manual processes.

DistillerSR has really been incredible. We ran an internal assessment and found that there was a 50% reduction in the time required to complete our literature reviews, which translates into time savings of 4 working days per literature review. Our team now gets 4 extra days to focus on the research, rather than the repetitive process of going through references.


Q: How easy was it for your organization to adopt a tool such as DistillerSR for your literature review process?

Alison: DistillerSR was very easy for our team to adopt. We were introduced to it by a third-party vendor, and from the first time I saw the software working, I never looked back. Everyone has been very receptive to the tool, especially those who had been used to doing literature reviews manually. More importantly, Evidence Partners’ customer success team worked tirelessly to ensure that the product was configured for our needs.

Sara: When DistillerSR was proposed as “a better way to do literature reviews,” I volunteered to see if it was something that could make our lives easier. No matter what, I knew it was going to be better than using a spreadsheet. I’ve been working with it for 2 years now and it’s been really great. Even in the best-case scenario, there will always be some level of resistance. In our case, the majority of the people working on clinical evaluations were tenured individuals who had been doing literature reviews the same way, manually, for a long time. They felt intimidated by a new software solution and by a potentially steep learning curve, which could affect their (already) tight deadlines. But the resistance disappeared once they put in the time to understand the system and its inherent benefits. Anybody we have hired at Philips for the clinical evaluations team since the implementation of DistillerSR has had to accept it as the new standard, and they have been extremely pleased with the process and the results.

Q: Have the current work arrangements, with distributed/hybrid teams, changed your lives at all?

Sara: We no longer have the option to grab another writer/reviewer, get in a room, and talk things through, so everything needs to be done remotely. Ultimately, we had to find ways to accomplish that. DistillerSR has been critical to ensuring continuity and transparency while working with multiple reviewers on a single project and managing several projects simultaneously.

Alison: Our team has always been remote, which effectively means we are always on, 24 hours a day throughout the work week. DistillerSR allows for seamless collaboration between team members in different time zones: one team member can finish their work day, and another can pick up the work at exactly that point and start their day. And as a project manager, you can track their progress on a dashboard and ensure continuity.

 


 

About the Author

Marc Dufresne

For more than 20 years, Marc has led results-oriented teams and initiatives to consistently exceed targets with an emphasis on building long-term relationships. At Evidence Partners, he has been instrumental in accelerating the adoption of DistillerSR by identifying new market opportunities. Marc was an OBJ Top 40 under 40 recipient in 2008 and holds a BA from Laurentian University.
