Elicit extracts data from 200 million papers into a comparison table, and that changes how you do a literature review
I want to describe a specific task rather than give a general overview, because the practical detail is more useful.
I was conducting a systematic review of a clinical intervention. The standard process: search databases, screen abstracts, pull full texts, extract the relevant data points from each paper, and tabulate them for comparison. On a review with 60 included papers, that extraction step alone is weeks of work if you do it manually.
Elicit automated most of the extraction step. I defined the data I needed from each paper: the population, the intervention details, the outcome measures, the sample size, and the follow-up period. For each paper in my set it extracted those fields automatically and populated a comparison table. The accuracy was not 100% and every row needed verification, but checking and correcting a pre-populated table was dramatically faster than building the table from scratch.
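One way to organize that verification pass is to triage the exported table first. The sketch below assumes the comparison table has been exported as CSV; the file contents, column names, and paper labels are hypothetical, not Elicit's actual export format. It flags rows with missing fields so manual checking can start with the least complete extractions (complete rows still need spot-checking; this just orders the work).

```python
import csv
import io

# Hypothetical sample of a comparison-table CSV export; a real export
# would use whatever field names you defined for extraction.
EXPORT = """paper,population,intervention,outcome_measure,sample_size,follow_up
Smith 2019,adults 18-65,drug A 10mg,HAM-D score,120,12 weeks
Jones 2021,,drug A 20mg,HAM-D score,,8 weeks
Lee 2020,adolescents,drug A 10mg,CGI-S score,85,
"""

REQUIRED = ["population", "intervention", "outcome_measure",
            "sample_size", "follow_up"]

def rows_needing_attention(csv_text):
    """Return (paper, missing_fields) pairs so manual verification
    can begin with the least complete extractions."""
    flagged = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        missing = [f for f in REQUIRED if not row.get(f, "").strip()]
        if missing:
            flagged.append((row["paper"], missing))
    return flagged

for paper, missing in rows_needing_attention(EXPORT):
    print(f"{paper}: verify {', '.join(missing)}")
```

On the sample data this prints the two incomplete rows (Jones 2021 and Lee 2020) with the fields to verify for each.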
The search function across a database of more than 200 million academic articles surfaced papers I had missed in my initial database searches, and the one-sentence summary attached to each result let me screen quickly without reading every abstract in full.
Upload and Analyze for my own PDFs extended the same extraction capability to papers I had already collected outside of Elicit's database.
Citation export to Zotero kept everything in my reference manager rather than creating a separate workflow.