Lukas Matejka
09/11/2025, 1:09 PM

Fanis Tharropoulos
09/11/2025, 1:56 PM
With exhaustive_search=false, the engine stops exploring as soon as it thinks it has enough good matches, to keep latency low. “Enough” is controlled by a couple of thresholds, which by default are very small, so the search can stop early.
What “enough” means:
- For typo exploration, it stops once the number of results reaches typo_tokens_threshold, default 1.
- For dropping tokens from a multi-word query, it stops once the total number of results reaches drop_tokens_threshold, default 1.
- If you use group_by, “results” means number of groups, not raw hits.
- It is also bounded by max_candidates, which caps how many correction or prefix candidates are tried per token, default about 10.
- A time budget, search_cutoff_ms, can also cut the search short.
In your case, the first few exact or near-exact paths produced some hits, the thresholds were met, and the search stopped before exploring deeper typo combinations, more token drops, or larger candidate sets.
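To make the defaults concrete, here is a minimal sketch of a search request with the early-stopping parameters spelled out at their defaults. The collection name, query, and fields are placeholders; the parameter names and default values are the ones discussed above.

```python
# Search parameters with the early-stopping knobs at their defaults.
# "q", "query_by", and the collection name are hypothetical placeholders.
search_parameters = {
    "q": "running shoes",
    "query_by": "name,description",
    # Stop exploring typo variations once this many results exist (default 1).
    "typo_tokens_threshold": 1,
    # Stop dropping tokens from a multi-word query once total results
    # reach this count (default 1).
    "drop_tokens_threshold": 1,
    # Cap on correction/prefix candidates tried per token.
    "max_candidates": 10,
}

# With a Typesense client, this dict would be passed as the search body, e.g.:
# client.collections["products"].documents.search(search_parameters)
```

Because both thresholds default to 1, a single early hit is enough to shut down further typo and token-drop exploration.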
How to get more without turning on exhaustive_search:
- Raise typo_tokens_threshold, for example to your page size like 10 or 20.
- Raise drop_tokens_threshold, for example 5 to 20 for multi-word queries.
- Increase max_candidates, for example 50 to 200.
- Increase search_cutoff_ms if you see search_cutoff: true in responses.
- Ensure prefixes and num_typos are set as you expect.
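Putting the suggestions above together, a tuned request might look like the sketch below. The specific values are illustrative, drawn from the ranges suggested above, and should be adjusted per workload; the query and fields are placeholders.

```python
# Hypothetical higher-recall tuning without exhaustive_search.
# Values are illustrative starting points from the ranges suggested above.
tuned_parameters = {
    "q": "running shoes",
    "query_by": "name,description",
    "typo_tokens_threshold": 20,   # e.g. match your page size
    "drop_tokens_threshold": 10,   # explore more token drops for multi-word queries
    "max_candidates": 100,         # try more correction/prefix candidates per token
    "search_cutoff_ms": 1000,      # raise only if responses show search_cutoff: true
}
```

Raising the thresholds to around the page size means the engine keeps exploring until it has a full page of candidates, rather than stopping at the first hit.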
If you want maximum recall, set exhaustive_search=true. That removes most early stopping but will be slower.

Lukas Matejka
09/12/2025, 6:26 AM