# community-help
r
I added 90k documents, but it kept on adding until it reached around 300k.
k
Check if your client has retries configured. If your timeout is too low and a retry kicks in because of it, the client will send the same data again.
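For reference, a minimal sketch of where those knobs live, assuming the typesense-js client (the values are illustrative, not recommendations):

```js
// Sketch: typesense-js client configuration (illustrative values).
// If connectionTimeoutSeconds is too low for a large import, the request
// times out client-side and numRetries re-sends the same batch, which can
// duplicate documents the server already ingested.
const Typesense = require('typesense');

const client = new Typesense.Client({
  nodes: [{ host: 'localhost', port: 8108, protocol: 'http' }],
  apiKey: 'xyz',                 // replace with your admin API key
  connectionTimeoutSeconds: 300, // raise this for large bulk imports
  numRetries: 0,                 // disable retries, or make them idempotent with upserts
});
```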
r
Oh okay, sure. I adjusted the batch size to 1000 and it seems to be going better.
k
Yeah, that's because a smaller batch size doesn't time out.
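A sketch of that kind of client-side batching, in case it helps someone else (the collection name and documents are placeholders):

```js
// Sketch: import documents in chunks of 1000 so each request stays
// well under the client timeout.
async function importInBatches(client, collectionName, documents, batchSize = 1000) {
  for (let i = 0; i < documents.length; i += batchSize) {
    const batch = documents.slice(i, i + batchSize);
    await client
      .collections(collectionName)
      .documents()
      .import(batch, { action: 'upsert' });
  }
}
```

With `action: 'upsert'` and stable ids, a batch that gets re-sent overwrites the same documents instead of duplicating them.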
r
It still seems to be over-importing, and when I try to delete, it doesn't work.
Any ideas what to do? It worked fine with another dataset that had fewer facets.
k
Did you try increasing the client timeout first?
r
I resolved the problem: I gave each item a unique id. But now the issue is that only 88k of the 94k documents are uploading.
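For anyone hitting the same duplicate problem: Typesense identifies documents by their `id` field, so deriving a stable id from your own data makes re-sent batches update in place rather than multiply. A sketch (the `sku` field is a made-up example of a natural key):

```js
// Sketch: give every document a stable, unique id before importing.
// Re-sent batches with action: 'upsert' then update in place
// instead of creating duplicates.
const docsWithIds = items.map((item) => ({
  ...item,
  id: String(item.sku), // hypothetical natural key; any stable unique value works
}));
```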
k
Check the response of the import operation. Each line will have either `{"success": true}` or the actual error explaining why the document was not imported.
r
How would I do that in Node if it has to check 94k documents?
Usually it throws an error saying it timed out, but the import still goes through.
k
We can't fail an entire import just because a few documents are malformed or don't conform to the expected schema. So the import goes through, but in the resulting JSON response we highlight which documents failed to import.
Each line in the response corresponds to the same line in the import, in the same order.
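If you're working with the raw JSONL response (which the JS client returns when you import a string of JSONL), that line-by-line correspondence lets you map failures back to your input by index. A sketch (field names follow the `success`/`error` shape shown above):

```js
// Sketch: the import response is JSONL; line i describes input document i.
function collectFailures(importResponseJsonl, inputDocs) {
  const failures = [];
  importResponseJsonl.trim().split('\n').forEach((line, i) => {
    const result = JSON.parse(line);
    if (!result.success) {
      failures.push({ index: i, error: result.error, document: inputDocs[i] });
    }
  });
  return failures;
}
```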
r
```
ImportError: 88673 documents imported successfully, 5496 documents failed during import. Use error.importResults from the raised exception to get a detailed error reason for each document
```
This is what I'm getting.
k
Yes, you have to loop through `error.importResults` to get the details. Doesn't that work?
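A sketch of that loop, catching the `ImportError` the node client raises (the per-result fields are assumed to follow the same `success`/`error` shape as the raw response):

```js
// Sketch: inspect per-document results when the node client throws.
try {
  await client
    .collections('items')
    .documents()
    .import(docsWithIds, { action: 'upsert' });
} catch (err) {
  if (err.importResults) {
    const failed = err.importResults.filter((r) => !r.success);
    console.log(`${failed.length} documents failed`);
    failed.slice(0, 10).forEach((r) => console.log(r.error)); // show a sample
  } else {
    throw err; // some other failure (network, auth, ...)
  }
}
```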
r
I resolved it, thanks! I looked back at the previous solution.