#community-help

Optimizing Uploads to Typesense and Dealing with Hits Limit

TLDR Masahiro asked how to upload documents to Typesense faster and about the limit_hits search parameter. Jason suggested batching documents and debugged the search query. Issue was resolved.

Apr 25, 2021 (31 months ago)
Masahiro
06:18 AM
Hi,
I have 3 questions.
1. Is there a faster way to upload documents to Typesense? (I'm okay with the upload speed now, but I'd like to make it faster if possible.)
2. What are the advantages over Meilisearch? (Typesense Cloud and unlimited filters?) https://typesense.org/docs/overview/comparison-with-alternatives.html#typesense-vs-meilisearch
3. limit_hits did not work; the search request retrieves all documents.
I'd appreciate it if you could answer my questions 😄
Jason
08:42 PM
Masahiro 1. The fastest way to import documents is through the documents/import endpoint, ingesting a large batch of documents at once (assuming the server has sufficient capacity to handle the volume). I've been able to import 2M records in 3 mins by importing 10K docs per import call.
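As a rough sketch (not from this thread), a batched import with the JavaScript client could look like the following; the node settings, collection name, and 10K batch size are placeholder assumptions:

const Typesense = require('typesense');

const client = new Typesense.Client({
  nodes: [{ host: 'localhost', port: 8108, protocol: 'http' }], // placeholder node
  apiKey: 'xyz',                                                // placeholder API key
  connectionTimeoutSeconds: 300                                 // leave headroom for large imports
});

// Send documents in large batches, each batch as a single import call.
async function importInBatches(documents, batchSize = 10000) {
  for (let i = 0; i < documents.length; i += batchSize) {
    const batch = documents.slice(i, i + batchSize);
    const results = await client
      .collections('users')
      .documents()
      .import(batch, { action: 'upsert' });
    // Each entry in the response says whether that document was imported successfully.
    const failed = results.filter((r) => !r.success);
    if (failed.length > 0) {
      console.error(`batch at offset ${i}: ${failed.length} documents failed`);
    }
  }
}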
Jason
08:44 PM
2. Looks like you found the comparison table! Besides the feature set, the biggest highlight I'd say is that Typesense can run in a clustered mode, whereas Meilisearch can only run on a single node and does not support a clustered configuration, which makes it a potential single point of failure in a production setting. So it's not production-ready yet.
Jason
08:45 PM
3. Could you share the exact search query you're using with limit_hits?
Apr 26, 2021 (31 months ago)
Masahiro
12:18 AM
1. Thanks! I will add this tip to the documentation later. Also, do you have any plans to add an FAQ section? It would help people find answers more quickly!
2. Wow, cool. Why not mention that in the table? 😆
3. This is the query I used:
const searchRequests = {
    searches: [
      {
        collection: 'users',
        q: `${ids}`,
        query_by: 'userId',
        limit_hits: 10,
        page: 1,
        per_page: 10
      }
    ]
  };
  try {
    const result = await client.multiSearch.perform(searchRequests);
    console.log(result);
    // console.log(result['results'][0]['hits']);
  } catch (e) {
    console.error(e);
  }
Jason
02:26 AM
1. No plans yet, but maybe we should add one! Maybe in the overview section? Feel free to add it, and I can then also add a few questions that have come up in the past.
2. It's mentioned very lightly in the table as "Fault Tolerance". I'll update it to make it clearer.
3. What happens when you change page to 2 in the same query?

Masahiro
04:56 AM
Because limit_hits was set to 10, the query failed.
So I tried this query, but it did not work either:
{
  collection: 'users',
  q: `${ids}`,
  query_by: 'userId',
  page: 2,
  per_page: 10
}
Masahiro
05:11 AM
The query found 450 documents, but there were only 10 hits.
Thanks, it's solved!!
Jason
05:21 AM
Yup, it will still return the total number of documents found, but it won't let you paginate through them beyond limit_hits.
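To sketch what that looks like with the multi-search call from earlier in the thread (the wildcard query here is an assumption, used just to match all documents):

const searchRequests = {
  searches: [
    {
      collection: 'users',
      q: '*',              // wildcard query, assumed here just to match everything
      query_by: 'userId',
      limit_hits: 10,      // caps how many hits can be fetched across all pages
      per_page: 10,
      page: 1              // page 2 would start at hit 11, past the cap, and fail as seen above
    }
  ]
};

const result = await client.multiSearch.perform(searchRequests);
console.log(result.results[0].found);       // still the total number of matching documents, e.g. 450
console.log(result.results[0].hits.length); // at most 10, because of limit_hits / per_page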