# community-help
l
Hello! Recently we noticed that one of our field types wasn’t the best fit for the data we were inserting (the type was int64 and the data was bigger than the max value of int64, which led to corrupted fields in the index). We updated the field in the schema to be a string, and when I run a search, I see that the Typesense search page still returns the old corrupted data while a search through the API returns the correct data. Is there some kind of caching somewhere?
f
By Typesense search page do you mean the one on Typesense Cloud?
l
Yes!
f
And did you create another field or update the old one by dropping it?
l
I ran a schema update with a drop + create like this
```
{
  "fields": [
    {
      "name": "my_field",
      "drop": true
    },
    {
      "name": "my_field",
      "type": "string"
    }
  ]
}
```
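For reference, a drop + create alter like this is submitted as a single PATCH to the collection endpoint. A minimal sketch of sending it, assuming a placeholder host, collection name, and API key (adjust all three for your cluster):

```javascript
// The drop + create pair travels together in one schema-alter payload:
// dropping the old int64 field and re-creating it with type string.
const alterPayload = {
  fields: [
    { name: "my_field", drop: true },     // remove the old int64 field
    { name: "my_field", type: "string" }, // re-create it as a string
  ],
};

// Hypothetical call; host, collection name, and API key are assumptions.
async function alterCollection(host, collection, apiKey) {
  const res = await fetch(`${host}/collections/${collection}`, {
    method: "PATCH",
    headers: {
      "X-TYPESENSE-API-KEY": apiKey,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(alterPayload),
  });
  return res.json();
}
```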
f
And the result from the API is coming back correctly while Typesense Cloud is displaying the old one? Could you send me the Response tab from the request in your browser's developer tools?
l
> And the result from the API is coming back correctly while Typesense Cloud is displaying the old one?
Yes, I can’t share the full response here, but I can share that in the Response tab the data is correct 😅

Response tab:
```
"hash": 10722174425384670420,
```
Search page:
```
"hash": 10722174425384671000,
```
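The difference between the two values can be reproduced in a few lines: integers above `Number.MAX_SAFE_INTEGER` (2^53 − 1) cannot be represented exactly by a JavaScript number, so the browser rounds the value when it parses the API response for display, while `BigInt` keeps every digit.

```javascript
// The raw JSON body as the API returns it, with the full 20-digit hash.
const raw = '{"hash": 10722174425384670420}';

const parsed = JSON.parse(raw).hash;          // rounded to the nearest double
const exact = BigInt("10722174425384670420"); // BigInt keeps every digit

parsed > Number.MAX_SAFE_INTEGER;             // true: outside the safe range
String(parsed) === "10722174425384670420";    // false: precision was lost
exact.toString() === "10722174425384670420";  // true: round-trips exactly
```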
In any case, it’s not critical 🙂 I was just thrown off because I had just updated the field in the schema and saw that the Typesense Cloud search page showed the exact same result as before. It’s probably a small display bug.
j
This sounds like a numeric overflow issue in JavaScript. The API is still returning the value as an integer, not a string, and when the browser tries to render the large numeric value, it rounds it past the maximum precision it supports.

The larger issue is that I don't think we support converting datatypes like this when using alters. The schema says string, but the actual datatype returned by the API is a number, so the conversion didn't actually work. I'd recommend dropping this field, setting it to null across all records, then changing the datatype and re-updating just this field across all records, sending the JSON value as a string instead of an integer.
l
Indeed, the actual value returned is still an integer. What I decided to do was a pass over the whole dataset to stringify the existing numeric hash and then update the records. So far this is working well, and the JavaScript numeric overflow disappears for the data that has been updated.
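A minimal sketch of that pass, assuming the field is named `hash` and the updated records are re-sent through Typesense's bulk import endpoint with `action=update` (JSONL, one document per line). Note that the exact digits must come from the original data source, since a value already parsed into a JavaScript number has been rounded:

```javascript
// Convert a record's numeric hash to a string, leaving already-converted
// records untouched. Field name "hash" is an assumption for illustration.
function stringifyHashField(doc) {
  if (typeof doc.hash === "number" || typeof doc.hash === "bigint") {
    return { ...doc, hash: doc.hash.toString() };
  }
  return doc;
}

// Build the JSONL body for a bulk update: one JSON document per line.
function toImportBody(docs) {
  return docs.map((d) => JSON.stringify(stringifyHashField(d))).join("\n");
}
```

Using `BigInt` for the source values (rather than plain numbers parsed from JSON) is what keeps every digit intact before stringification.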