Error Building Search App - Timeout Issue with Data Import
TLDR Zaiste hit a timeout error while following a search-app tutorial because of the size of his data set. He suggested a batch import method and sent a PR to the docs. Jason acknowledged the issue and thanked Zaiste for the PR.
Nov 15, 2021 (26 months ago)
Request #1637009712744: Request to Node 0 failed due to "ECONNABORTED timeout of 2000ms exceeded", and the populate script freezes while some data is added. The books data set has 9,979 items; when I try with a smaller set of just a few books, it works without a problem. How should I initialise the client to avoid this error?
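Since the question is about client initialisation: a sketch of a typesense-js client configuration, with placeholder host, port, and API key. `connectionTimeoutSeconds` is a real typesense-js option and appears to govern the per-request timeout seen in the error; raising it gives large write requests more time to complete (the value 120 is illustrative, not a recommendation from the thread):

```javascript
// Client configuration sketch; host, port, and apiKey are placeholders.
// connectionTimeoutSeconds raises the per-request timeout so a long
// write is not aborted after a couple of seconds.
const typesenseConfig = {
  nodes: [{ host: 'localhost', port: 8108, protocol: 'http' }],
  apiKey: 'REPLACE_ME',
  connectionTimeoutSeconds: 120, // illustrative; default is much lower
};

// Requires the 'typesense' npm package:
//   const Typesense = require('typesense');
//   const client = new Typesense.Client(typesenseConfig);
```

A longer timeout alone only masks the underlying problem, though; batching the import so each request is small is the more robust fix.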
Use the .import() method along with the batch option. I made a suggestion to the docs: https://github.com/typesense/typesense-website/pull/108/files
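A minimal sketch of the batching idea in plain Node.js. The 'books' collection name comes from the thread; the batch size is illustrative, and the client call shown in the comment assumes typesense-js, where the option is spelled batchSize:

```javascript
// Split a large document array into fixed-size batches so each
// import request stays small enough to finish before the client
// times out, instead of inserting documents one by one.
function toBatches(docs, batchSize) {
  const batches = [];
  for (let i = 0; i < docs.length; i += batchSize) {
    batches.push(docs.slice(i, i + batchSize));
  }
  return batches;
}

// With typesense-js, the bulk import endpoint does the batching
// in a single call (batch size of 100 is illustrative):
//   await client.collections('books').documents()
//     .import(documents, { action: 'create', batchSize: 100 });
```

Sending ~100 documents per request instead of 9,979 individual inserts avoids both the request timeout and the per-insert overhead.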
In any case, that example, which reads one line at a time and imports documents into Typesense one by one, was written before we added the bulk import feature. I've been meaning to update it to use the import method, so thank you for the PR!
Nov 16, 2021 (26 months ago)
Typesense 0.21.0. I believe these one-at-a-time inserts simply overwhelm Typesense.
Revisiting Typesense for Efficient DB Indexing and Querying
kopach experienced slow indexing and crashes with Typesense. The community suggested using batch import and checking the server's resources. Improvements were made, but additional support was needed for special characters and multi-search queries.
Troubleshooting Indexing Duration in Typesense Import
Alan asked about lengthy indexing times when importing documents into Typesense. Jason suggested various potential causes, including network connectivity and system resources. They later identified the problem as an error in Alan's code.
Troubleshooting Write Timeouts in Typesense with Large CSVs
Agustin had issues with Typesense getting write timeouts while loading large CSV files. Kishore Nallan suggested chunking data or converting to JSONL before loading. Through troubleshooting, they identified a possible network problem at AWS and found a workaround.