Why I don’t read AI-generated summaries
Don’t give me a TLDR. Give me something worth my time.
One of the most common uses of LLMs is summarization. Toss in an hour-long YouTube video, a 5,000-word blog post, or a 2-hour meeting transcript—and out comes a neat little summary.
But I don’t know about you—I’ve never found these summaries useful. Why not?
1. AI doesn’t know what I care about.
Summaries aren’t objective. A good summary is always subjective—it changes depending on who’s reading it.
When I watch interviews with people like Sam Altman or Dario Amodei, I’m focused on how model capabilities can translate into real-world products. A researcher might care more about model architecture. Someone in AI safety might zoom in on governance and risk.
Without knowing what I care about, AI can only generate a generic list of bullet points—like a table of contents from a boring textbook.
That’s why many new AI note-taking tools now offer audience-specific templates: the same meeting gets summarized differently for engineers vs. product teams.
2. Speed isn’t always the goal.
Summaries are meant to save time. But when I consume long-form content, I’m not always trying to be efficient—I’m doing it for enjoyment, insight, and depth.
I don’t want just a synopsis. I want to experience the full arc, tone, pacing, and personality of the original.
For example, I recently watched an interview with the founder of Granola where he talked about how application-layer startups should focus on high-frequency, vertical use cases, because users are likely to turn to horizontal tools like ChatGPT for low-frequency use cases. That single insight stuck with me and influenced how I think about AI product strategy.
The AI-generated summary version? “The co-founders believed that specialized tools are more effective than building foundational models.”
Yawn. That’s not just bland—it completely misses the point.
As a non-technical person, I finally understood how these models work thanks to Andrej Karpathy’s 3.5-hour “Deep Dive into LLMs like ChatGPT.” Not because of its structure, but because of how he explains things: metaphors, diagrams, accessible language. None of that comes through in a summary.
People say attention spans are shrinking—but I think being able to finish a good hour-long video is a real differentiator.
The cult of TL;DR
We live in a world obsessed with TLDR: “Too long, didn’t read/watch.”
But relying on AI-generated summaries is like reading SparkNotes and pretending you’ve read the novel.
Come on—we’re not in school anymore. No one’s testing us. I consume content because I want to.
Still, there’s too much content and too little time. So where can AI help?
What we need isn’t summarization. It’s curation.
I don’t need AI to rewrite or condense the content. I need it to help me find the content worth consuming in the first place.
I want an agent that knows my taste. One that scours the entire web on my behalf and says:
“This 3.5-hour Karpathy video is worth watching in full—don’t skip a second.”
“There’s a 1-hour Sam Altman talk. Based on your interest in AI products, I’d recommend 20:21–31:52, where he covers OpenAI’s product roadmap. I’ll summarize the rest so you can decide if it’s worth your time.”
That’s useful.
From algorithms to agents
Mobile-era recommendation algorithms were statistical guessers. They optimized for engagement, not meaning. And increasingly, I don’t trust them—they’re good at giving me brain-rot, not value.
But now that AI can understand language—and not just crunch numbers—we have a shot at something better.
Personal agents that get what I care about. That surface content not just because others liked it, but because I will.
We don’t need more summaries.
We need better curators.