Now I Get It: Science Simplified for Everyone
Science Communication · AI-Generated Content · Information Literacy · Academic Publishing · Interactive Web Tools
Article: Positive · Community: Positive/Mixed
Now I Get It! is a web-based tool that converts scientific research papers into easy-to-understand interactive pages. Users upload PDFs to receive a plain-language explanation that is both shareable and accessible. The site also hosts a gallery of summaries ranging from AI breakthroughs to medical studies.
Key Points
- Translates dense scientific PDFs into interactive, plain-language web pages.
- Provides an automated workflow from file upload to web publication.
- Features a public gallery of summarized research across diverse scientific fields.
- Aims to make specialized academic knowledge shareable and understandable for a general audience.
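The upload-to-publication workflow in the points above can be sketched roughly as a three-stage pipeline. The tool's actual internals are not public, so every function name here is hypothetical, and the extraction and summarization steps are stubs standing in for a real PDF parser and an LLM call:

```python
# Minimal sketch of the described workflow: PDF in, shareable web page out.
# All names are hypothetical; stubs stand in for real PDF parsing and an LLM.

def extract_text(pdf_bytes: bytes) -> str:
    """Stand-in for PDF text extraction (a real tool might use pdfminer.six)."""
    return pdf_bytes.decode("utf-8", errors="ignore")

def summarize(text: str) -> str:
    """Stand-in for an LLM call that produces a plain-language summary."""
    first_sentence = text.split(".")[0].strip()
    return f"In plain terms: {first_sentence}."

def render_page(title: str, summary: str) -> str:
    """Wrap the summary in a minimal shareable HTML page."""
    return (
        "<!doctype html><html><head>"
        f"<title>{title}</title></head>"
        f"<body><h1>{title}</h1><p>{summary}</p></body></html>"
    )

def paper_to_page(title: str, pdf_bytes: bytes) -> str:
    """Full pipeline: extract, summarize, publish as HTML."""
    return render_page(title, summarize(extract_text(pdf_bytes)))
```

In a real deployment the `summarize` stub would be replaced by a model call and `render_page` would emit interactive elements rather than static HTML; the pipeline shape, however, stays the same.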
Sentiment
The community was broadly positive and engaged, with many expressing willingness to pay and offering constructive feature suggestions. However, there were meaningful concerns about output quality, hallucinations, and whether the tool adds sufficient value over using LLMs directly. The creator's transparency and responsiveness contributed to a warm reception overall.
In Agreement
- Making scientific papers accessible through interactive web pages is a compelling use case for AI, especially for laypeople and non-specialists
- The tool serves as a useful on-ramp for triaging papers when you have more to read than time allows
- Several users confirmed the generated summaries were accurate when checked against their domain knowledge
- The app is valuable for sharing research with non-technical friends, family, or teammates who need to get on the same page quickly
- The concept of translating papers into software rather than just summarizing them represents an interesting creative direction
Opposed
- The generated output is far from the quality of hand-crafted interactive explanations like those from distill.pub, ciechanow.ski, or Red Blob Games
- LLM-generated pages have a recognizable AI aesthetic that some find off-putting, resembling buzzword-filled tech landing pages
- Hallucinations and factual errors in the generated content undermine trust and educational value
- The same results can be achieved with a simple prompt to any frontier LLM, questioning the value of a wrapper application
- There is no way to verify whether users actually understand the material better versus just feeling like they do
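The "simple prompt to any frontier LLM" objection above can be made concrete. The prompt wording below is purely illustrative, not taken from the tool, and the commented-out API call is only an example of what such a direct request might look like with the OpenAI Python SDK:

```python
# Sketch of the "just prompt an LLM directly" alternative.
# The prompt text is illustrative; it is not the tool's actual prompt.

def build_prompt(paper_text: str) -> str:
    """Wrap raw paper text in a plain-language explanation request."""
    return (
        "Explain the following research paper in plain language for a "
        "general audience, as a short walkthrough a layperson could follow:\n\n"
        + paper_text
    )

# With an API client, the direct call might look like:
# client.chat.completions.create(
#     model="gpt-4o",
#     messages=[{"role": "user", "content": build_prompt(paper_text)}],
# )
```

The counterargument, implicit in the positive comments, is that the wrapper's value lies in the automated publishing and gallery around the prompt, not in the prompt itself.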