Luke Hill
02/27/2025, 3:43 PM
We have a price field with values ranging up to 100_000_000, and in our UI we use a range slider to allow the user to select what prices they would like to see. We then build a filter from the selected range, for example filter_by: price:[100..500000].
We have about 3.5 million records and on average this search takes 1 second to complete.
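For context, the query looks roughly like this on our side (sketched with the JavaScript client; the collection name, query_by field, and connection details are simplified placeholders):

```
import Typesense from 'typesense';

// Illustrative client setup; our real nodes/API key are omitted
const client = new Typesense.Client({
  nodes: [{ host: 'localhost', port: 8108, protocol: 'http' }],
  apiKey: 'xyz',
  connectionTimeoutSeconds: 2,
});

// The slider's min/max are interpolated straight into filter_by
async function searchByPrice(min: number, max: number) {
  return client.collections('products').documents().search({
    q: '*',
    query_by: 'name',
    filter_by: `price:[${min}..${max}]`,
  });
}
```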
Our current ideas are to try all of the following:
• Adding range_index to the schema (rough schema sketch after this list)
  ◦ We’ve done this for part of our data set and so far we are only seeing a very small improvement
• Increase RAM
• Move to run in a VM
• Move to Typesense Cloud
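Here is roughly what that range_index change looks like on our side (the collection name and remaining fields are trimmed-down placeholders):

```
// Trimmed-down sketch of the schema with range_index enabled on price;
// the real collection has more fields.
const schema = {
  name: 'products',
  fields: [
    { name: 'name', type: 'string' },
    // range_index builds an index optimized for numeric range filters
    // such as price:[100..500000]
    { name: 'price', type: 'int64', range_index: true },
  ],
};

// Applied with: client.collections().create(schema)
```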
If all of that doesn’t help, we have thought about turning this into a string-based filter over a set of price buckets: something priced at 10 would be assigned priceFilter: '0' (the 0 to 100 bucket) and something priced at 12000 would get priceFilter: '10000' (the 10_000 to 20_000 bucket). We would then turn a selected range into a filter like filter_by: (priceFilter:='0' || priceFilter:='100' || priceFilter:='1000' || priceFilter:='10000'). We would like to avoid this, though, since applying the new priceFilter to all 3.5 million records would be a huge undertaking.
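For what it’s worth, the bucketing we have in mind would look something like this (the bucket boundaries here are only illustrative; the real list would have to cover the full range up to 100_000_000):

```
// Illustrative bucket boundaries; the real list would need to cover 0..100_000_000
const BUCKETS = [0, 100, 1_000, 10_000, 20_000 /* ... */];

// Label each document with the largest boundary <= its price
function priceToBucket(price: number): string {
  let bucket = BUCKETS[0];
  for (const b of BUCKETS) {
    if (b <= price) bucket = b;
    else break;
  }
  return String(bucket);
}

// Build the OR filter for a slider range by collecting every bucket it overlaps
function bucketFilter(min: number, max: number): string {
  const clauses = BUCKETS
    .filter((b, i) => (BUCKETS[i + 1] ?? Infinity) > min && b <= max)
    .map((b) => `priceFilter:='${b}'`);
  return `(${clauses.join(' || ')})`;
}

// bucketFilter(0, 12_000) ->
// "(priceFilter:='0' || priceFilter:='100' || priceFilter:='1000' || priceFilter:='10000')"
```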
If there is a better way, or something we are missing, I would really appreciate some advice. Thanks in advance.