Data Modelling and Update Limitations for Typesense
TLDR: Taj asked how best to model data in Typesense and whether there are limits on document updates. Jason recommended creating separate collections and explained that update throughput depends on the number of CPU cores, the size of each document, and the total number of documents in the collection. Rate limiting options were also discussed.
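For readers who want to see what the separate-collections approach looks like in practice, here is a minimal sketch using the Typesense Python client. The server address, API key, collection names, and fields are illustrative assumptions, not details taken from the thread.

```python
import typesense

# Hypothetical connection details; replace with your own server and key.
client = typesense.Client({
    "nodes": [{"host": "localhost", "port": "8108", "protocol": "http"}],
    "api_key": "YOUR_API_KEY",
    "connection_timeout_seconds": 10,
})

# One collection per logical entity type (names and fields are assumptions).
products_schema = {
    "name": "products",
    "fields": [
        {"name": "title", "type": "string"},
        {"name": "price", "type": "float"},
        {"name": "category", "type": "string", "facet": True},
    ],
}

reviews_schema = {
    "name": "reviews",
    "fields": [
        {"name": "product_id", "type": "string"},
        {"name": "body", "type": "string"},
        {"name": "rating", "type": "int32"},
    ],
}

for schema in (products_schema, reviews_schema):
    client.collections.create(schema)
```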
Aug 16, 2023
Taj
08:12 PM
Jason
08:19 PM
Taj
08:21 PM
Taj
08:22 PM
Jason
08:22 PM
Yes, but it depends on the number of CPU cores you have, the size of each document, and the total number of documents in the collection.
For example, I've been able to update 2.2M documents, with 5-6 fields each, in about 4 minutes on a 4vCPU server.
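As a rough illustration of the kind of bulk update described above, the sketch below uses the Typesense Python client's bulk import call with the `update` action. The `client` setup, collection name, document shape, and batch size are assumptions; actual throughput depends on CPU cores, document size, and collection size, as Jason notes.

```python
# Assumes `client` is a configured typesense.Client and that a "products"
# collection already exists (both are illustrative assumptions).
updated_docs = [
    {"id": str(i), "price": 9.99}   # partial documents keyed by id
    for i in range(1000)
]

# Bulk-update existing documents; the server processes the batch in one call.
results = client.collections["products"].documents.import_(
    updated_docs,
    {"action": "update", "batch_size": 100},
)

# Each entry in the response reports per-document success; surface failures.
failures = [r for r in results if not r.get("success")]
print(f"{len(updated_docs) - len(failures)} updated, {len(failures)} failed")
```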
Jason
08:23 PM
Taj
08:25 PM