Understanding Data Storage in Typesense
TLDR Ethan asked how to index large amounts of data. Jason explained that Typesense is designed as a secondary data store, so any data you want to surface in search results must also be stored in Typesense.
Jan 11, 2023 (11 months ago)
Ethan 10:05 PM
Jason 10:07 PM
Ethan 10:08 PM
Jason 10:09 PM
Jason 10:10 PM
Jason 10:11 PM
Ethan 10:11 PM
Jason 10:11 PM
Ethan 10:11 PM
Jason 10:12 PM
Ethan 10:13 PM
Jason 10:14 PM
This is not possible with Typesense. You would have to put at least all the data you want to surface in search results in Typesense.
Ethan 10:14 PM
Jason 10:14 PM
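Jason's point in practice: because Typesense is a secondary data store, the job that syncs records from your primary database should send along every field you want to render in search results, not just the fields you search on. The snippet below is a minimal sketch using the official Python client, assuming a local single-node cluster and a hypothetical products collection with made-up field names; adapt it to your own schema.

```python
import typesense

# Assumed local node and API key; replace with your own cluster details.
client = typesense.Client({
    'api_key': 'xyz',
    'nodes': [{'host': 'localhost', 'port': '8108', 'protocol': 'http'}],
    'connection_timeout_seconds': 2,
})

# Declare only the fields you need to search, filter, or sort on.
schema = {
    'name': 'products',
    'fields': [
        {'name': 'title', 'type': 'string'},
        {'name': 'description', 'type': 'string'},
        {'name': 'price', 'type': 'float'},
    ],
    'default_sorting_field': 'price',
}
client.collections.create(schema)

# Sync documents from the primary database, including display-only fields
# (e.g. image_url) so search results can be rendered without a second lookup.
documents = [
    {
        'id': '1',
        'title': 'Trail running shoe',
        'description': 'Lightweight shoe with a grippy outsole',
        'price': 49.99,
        'image_url': 'https://example.com/shoe.jpg',  # display-only field
    },
]
client.collections['products'].documents.import_(documents, {'action': 'upsert'})
```

Fields that are not declared in the schema (like image_url here) are still stored and returned with each hit, so display-only attributes generally do not need to be indexed; only the declared fields are indexed in memory.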
Similar Threads
Understanding Typesense Indexing and Memory Usage
Ed inquires about the pros and cons of indexing in Typesense. Kishore Nallan and Jason explain the purpose and benefits of Typesense as a secondary data store and how to optimize memory usage.
Troubleshooting Typesense Document Import Error
Christopher had trouble importing 2.1M documents into Typesense due to memory errors. Jason clarified the system requirements, explaining the correlation between RAM and dataset size, and ways to tackle the issue. They also discussed database-like query options.
Using Typesense to Index Large Amounts of Data
Rafael wants to use Typesense to index 100M documents currently in MongoDB Atlas. Jason affirmed Typesense can handle it and asked for more details.
Discussing Document Indexing Speeds and Typesense Features
Thomas asks about the speed of indexing and associated factors. The conversation reveals that larger batch sizes and NVMe disk usage can improve speed, but the index size is limited by RAM. Jason shares plans on supporting nested fields, and they explore a solution for products in multiple categories and catalogs.
Can Typesense be Disk-based Instead of RAM-based?
Xavi questioned if Typesense could be disk-based. Jason clarified that Typesense does not have a disk-only mode and explained index size.