#community-help

Troubleshooting a High Volume Search Response

TLDR Sophie was getting oversized search responses despite setting a lower limit. Jason suggested checking the actual result count and limiting the fields returned for each document, which resolved the issue.

Sep 23, 2023
Sophie
08:56 PM
hey, i'm getting a 5mb response with some hundreds of results when i've set the limit to much lower values. I've tried different limit parameters from different versions of the docs, and checked that the request is being sent through correctly with wireshark in each case. any ideas as to what to do next to figure this out?
Jason
08:56 PM
Could you share the full search request with all the search parameters you're using?
Sophie
08:58 PM
yep, here's the most recent go!

http://jammy:8108/collections/pdfs/documents/search?q=pdf&query_by=content&per_page=10
Sophie
08:58 PM
ooh, silly query there, now that i look at it again – but the q value should work fine with this collection anyway
Jason
09:00 PM
Could you do something like this:

curl <your search request> | jq '.hits | length'
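
A concrete form of that check, sketched from the request Sophie shared above; it assumes the instance accepts the request without an API key, as in her URL, whereas a typical Typesense deployment would also send an X-TYPESENSE-API-KEY header:

# count the hits that actually came back
curl -s "http://jammy:8108/collections/pdfs/documents/search?q=pdf&query_by=content&per_page=10" | jq '.hits | length'

# check the raw response size in bytes
curl -s "http://jammy:8108/collections/pdfs/documents/search?q=pdf&query_by=content&per_page=10" | wc -c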
Sophie
09:07 PM
yea one sec
Sophie
09:10 PM
ah, it's ten results
Sophie
09:10 PM
ok
Sophie
09:10 PM
should've thought to try that!
Sophie
09:10 PM
thanks
Sophie
09:10 PM
so how do i limit the size of the returned output? seems like it's returning the entire document contents?
Jason
09:10 PM
You can use include_fields or exclude_fields to return only particular fields in each document
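
Sketched against the same request, excluding the (assumed large) content field from the hits would look something like this; exclude_fields is passed as just another query parameter:

# return each hit without the full document body; 'content' is assumed to be the large field here
curl -s "http://jammy:8108/collections/pdfs/documents/search?q=pdf&query_by=content&per_page=10&exclude_fields=content" | jq '.hits[0]'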
Sophie
09:11 PM
cool thx

Sophie
09:11 PM
eeee, works perfect immediately! thank you so much.
Jason
09:12 PM
Happy to help!

Similar Threads

Optimizing Uploads to Typesense and Dealing with Hits Limit
Masahiro asked how to upload documents to Typesense faster and about the limit_hits search parameter. Jason suggested batching documents and debugged the search query. The issue was resolved.

Integrating Semantic Search with Typesense
Krish wanted to integrate semantic search functionality with Typesense but struggled with its limitations. Kishore Nallan provided resources, clarifications, and workarounds for the raised issues.

Increase Search Result Size and Filter Special Characters
Anton requested pagination beyond 250 hits per page and precise filtering for terms with special characters. Jason suggested multi-search as a workaround and planned to address special-character filtering in an upcoming release. A GitHub issue was created to track the feature request.

Issues and Improvements in Typesense with 14 Million Records
Miguel experienced performance issues when using Typesense for large datasets. Jason pointed to performance improvements made to Typesense since then and to specific server-side parameters for better handling. Miguel agreed to try again.

Enhancing Vector Search Performance and Response Time using Multi-Search Feature
Bill faced performance issues with vector search using the multi_search feature. Jason and Kishore Nallan suggested running models on a GPU and excluding large fields from the search. The discussion established that adding more CPUs and enabling server-side caching could improve performance. The thread concluded with a resolution.