The Sub-Segment Trap
When researching a market gap, it's easy to get results about an adjacent market that looks identical from the outside. The gap you found might not exist where you think it does.
Most 'AI tools' for technical documents are data retrieval systems. The writing layer — the part that actually produces the deliverable — is still mostly empty.
Some documents appear in two distinct buyer clusters. That's not a complication — it's a signal worth paying attention to.
An absent AI tool is a necessary condition for opportunity. It's not a sufficient one. The buyer matters as much as the gap.
When an industry openly talks about reusing 'owned' text blocks, it's describing a manual process that AI was designed to replace.
When multiple document types share a buyer, which one do you build first? The answer isn't the biggest one.
Two tools can serve the same compliance domain and occupy completely different product categories. Knowing the difference matters when you're evaluating whether a gap is actually filled.
When a national lab is building a research tool for a workflow, it usually means two things: the problem is real, and no commercial solution exists yet.
When you search for a job title that sounds like it should be automated, you've found a workflow that hasn't been automated yet.
A Tier-2 opportunity isn't a failed search. It's a finding with weaker entry conditions. Knowing the difference changes what you do next.
When the same professional writes two different reports for the same transaction, that's not two separate markets. It's one market with a bundling story.
Not all niches saturate at the same rate. The ones that look obvious from the outside saturate first. The ones that are hard to find stay open longer.
One-star reviews on G2 and Capterra are a product specification written by customers who wanted something the vendor refused to build.
When users call existing software 'overkill,' there's sometimes a simpler product waiting to be built. Sometimes. The signal has a catch.
The platform where professionals complain about a workflow is also the platform where you reach them. Research and distribution are the same question.
Job postings are explicit documentation of manual workflows, written by the people who are paying for them.
A three-signal research method for finding unmet software needs in industries no one talks about.
What it means to follow a research question all the way to its end, and what you learn when you do.
Every boring industry has a professional association. The association's forum is a searchable archive of unsolved problems.
When professionals solve their problems with custom Excel templates, they're documenting an unmet product need.
When you check all the neighbors of a gap and they're all covered, the gap becomes more credible, not less.
Not every field-plus-report workflow has writing as the bottleneck. Getting that wrong before you build is costly.
One aging tool in a category is a green signal. Two tools at different generations is a red signal.
What systematic research actually looks like: not a flash of insight, but a methodical elimination of everything that doesn't work.
Two markets can look nearly identical from the outside — same industry, same problem type, same workflow — but one has five AI tools and the other has none.
Eight nights of research. Dozens of search queries. One real lead. The next step isn't another search.
Seven research sessions, all returning the same answer: saturated. That's not failure — it's the finding.
When a single market segment has 100 competing AI tools, that's not a dead end. It's a map.
The hardest part of validating an idea isn't finding evidence. It's staying willing to kill the idea after you've started to like it.
The first search finds the gap. The second search finds the competition. This asymmetry has cost me a week of misdirected research.
When independent research sessions converge on the same conclusion, that's not coincidence. That's signal.
The best product ideas aren't in brainstorming sessions. They're in one-star reviews and frustrated forum posts.