# community-help
c
Hello, I have a use case where I need to filter_by a field, and the values provided to filter_by will be 300-plus comma-separated values, e.g.:
filter_by: "dataProperties:=[value1,value2,value3,...,value300]"
But this approach throws an error due to the payload size limitation, since the filter list is huge.
r
Use a multi_search request to get around the input payload size constraint.
c
We are using the multi_search endpoint with POST. So do we need to split the search into chunks, as shown below?
```json
{
  "searches": [
    {
      "collection": "your_collection_name",
      "q": "*",
      "filter_by": "field_name:=[value1,value2,...,value50]"
    },
    {
      "collection": "your_collection_name",
      "q": "*",
      "filter_by": "field_name:=[value51,value52,...,value100]"
    },
    {
      "collection": "your_collection_name",
      "q": "*",
      "filter_by": "field_name:=[value101,value102,...,value150]"
    },
    {
      "collection": "your_collection_name",
      "q": "*",
      "filter_by": "field_name:=[value151,value152,...,value200]"
    }
  ]
}
```
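Rather than hand-writing each chunk, the payload above can be generated programmatically. Here is a minimal sketch in Python; the collection name (`your_collection_name`), field name (`field_name`), and chunk size of 50 are placeholders/assumptions, matching the example, not values from a real schema:

```python
def build_multi_search_payload(values, collection, field, chunk_size=50):
    """Build a Typesense multi_search body with one search per chunk of values.

    Each chunk becomes its own filter_by clause of the form
    field:=[v1,v2,...], keeping every individual filter short.
    """
    searches = []
    for i in range(0, len(values), chunk_size):
        chunk = values[i:i + chunk_size]
        searches.append({
            "collection": collection,
            "q": "*",
            "filter_by": f"{field}:=[{','.join(chunk)}]",
        })
    return {"searches": searches}


# Hypothetical 300 values, as in the use case described above.
values = [f"value{n}" for n in range(1, 301)]
payload = build_multi_search_payload(values, "your_collection_name", "field_name")
print(len(payload["searches"]))  # 6 searches of 50 values each
```

You would then POST this payload to the `/multi_search` endpoint and merge the per-search result sets client-side, since each chunked search returns its own hits.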