# community-help
y
Hello! Are there known issues causing queries to run extremely slow (~30s on 1.5M records) when filtering on numerical values? e.g.
'filter_by'='a:>=10 && a:<=20'
in v27
k
Use range syntax:
a: [10..20]
If you want faster range queries, you can also enable range_index as a field property. This index will consume additional memory but will speed up range queries.
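To make the two suggestions concrete, here is a minimal sketch of what they could look like together. The collection name, field name, and field type are illustrative assumptions, not taken from the thread; only the `range_index` property and the `[min..max]` filter syntax come from the advice above.

```python
# Hypothetical collection schema enabling range_index on a numeric
# field "a" (names are illustrative). range_index trades extra memory
# for faster range filtering.
schema = {
    "name": "products",
    "fields": [
        {"name": "a", "type": "int32", "range_index": True},
    ],
}

# The same filter as the original question, rewritten in the
# suggested [min..max] range syntax.
search_params = {
    "q": "*",
    "filter_by": "a:[10..20]",
}

print(search_params["filter_by"])  # a:[10..20]
```

This only builds the request payload; sending it to a running server (e.g. via an HTTP client or an official SDK) is unchanged from any other search request.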
y
I will try, thanks! Is this new in v27?
k
No, it has always existed. We need to automatically convert the a:>=10 && a:<=20 format to this, but it's not yet implemented and is on our todo list.
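The automatic conversion mentioned here is not implemented yet; a rough sketch of what such a rewrite could look like (purely illustrative, not Typesense code) is a pattern match on paired `>=`/`<=` clauses over the same field:

```python
import re

# Illustrative only: rewrite "field:>=X && field:<=Y" into the
# equivalent "field:[X..Y]" range syntax. Clauses that don't match
# this exact shape are left untouched.
PATTERN = re.compile(
    r"(?P<f>\w+):>=(?P<lo>-?\d+(?:\.\d+)?)"
    r"\s*&&\s*"
    r"(?P=f):<=(?P<hi>-?\d+(?:\.\d+)?)"
)

def to_range_syntax(filter_by: str) -> str:
    return PATTERN.sub(
        lambda m: f"{m.group('f')}:[{m.group('lo')}..{m.group('hi')}]",
        filter_by,
    )

print(to_range_syntax("a:>=10 && a:<=20"))  # a:[10..20]
```

A real implementation inside the query parser would work on the parsed filter tree rather than on the raw string, but the effect is the same.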
f
I'm currently working on a guide for filtering; I'll link to the documentation as soon as it's finished!
y
Ok thanks, the [x..y] syntax seems to work very well in my case. It's weird that I only noticed these extremely long response times after migrating from v0.24 to v27.
Actually, the [x..y] syntax was not sufficient to solve the high response times. I'm trying the range_index flag on the field, but I'm still really intrigued that these kinds of range filters only became a problem when migrating from 0.24.1 to v27 😕 @Kishore Nallan any idea why?
k
Actually, just yesterday we narrowed down a regression in the v27 release. Can you please try 28.0.rc6? This contains a fix. We will be doing a 27.1 release this week.
y
sure, thanks for the update
j
Same for me. But why can't I set range_index on a float?
[ 'name' => 'location_lat', 'type' => 'float', 'range_index' => true, 'facet' => false ],
[ 'name' => 'location_lng', 'type' => 'float', 'range_index' => true, 'facet' => false ],
Or the simpler question: is this performance issue solved in 27.1, and can it be used in prod? 🙂
I can confirm that the bug is fixed in 27.1. CPU usage went back down, and response time is at least 10x better.
k
Yes, there was a regression that slipped through in 27.0, which we fixed in 27.1.