#community-help

Typesense Import Issue with HTTP Code 503 Error

TLDR Tomas hit an HTTP 503 error while importing into Typesense. Jason identified the issue as CPU exhaustion and recommended slowing down writes or upgrading to at least 4vCPU.


May 24, 2023 (6 months ago)
Tomas
06:27 PM
Hello!
Trying to import to Typesense, I got the error `Request failed with HTTP code 503 | Server said: Not Ready or Lagging`, and when I check the cluster, it only has 3k of 19k records.
Now, when I request the schema information, I see that there are 3125 contacts, but if I use a `filterBy`, I see around 9k records.

I tried to import again, and it doesn't throw errors, but I still have the same number of records.

Did I break the cluster? Is there a way to fix it?

Thank you!

(4 screenshots attached)
Jason
06:49 PM
Could you DM me your cluster ID? I can then take a closer look
Jason
06:51 PM
In the meantime, here’s general information about the 503 error: https://typesense.org/docs/guide/syncing-data-into-typesense.html#handling-http-503s
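The linked guide's advice boils down to retrying with exponential backoff whenever the import call returns a 503. A minimal sketch of that pattern (not the official client; `send` stands in for the actual HTTP POST to `/collections/<name>/documents/import?action=upsert`, and the retry counts and delays are illustrative):

```python
import time
from typing import Callable, Iterable


def import_with_retry(
    jsonl_lines: Iterable[str],
    send: Callable[[str], int],        # stand-in: POSTs the JSONL body, returns HTTP status
    max_retries: int = 5,
    base_delay: float = 1.0,
    sleep: Callable[[float], None] = time.sleep,
) -> int:
    """Send a JSONL import payload, backing off exponentially on 503."""
    body = "\n".join(jsonl_lines)      # Typesense's import endpoint takes newline-delimited JSON
    for attempt in range(max_retries + 1):
        status = send(body)
        if status != 503:              # success (or a non-retryable error) ends the loop
            return status
        sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ... between retries
    return 503                         # still lagging after all retries
```

In practice `send` would wrap your HTTP client of choice with the `X-TYPESENSE-API-KEY` header set; injecting it also makes the retry logic easy to test.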
Jason
06:52 PM
In your case though, it seems like your cluster doesn't have sufficient CPU capacity to process the writes in parallel, so it's lagging
Jason
06:52 PM
> Now, when I do the request to the schema information, I see that there are 3125 contacts, but if I use a filterBy, I see around 9k records.
This happens temporarily as the documents in the write queue are processed
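In other words, the count catches up once the write queue drains. To confirm that programmatically, one option is to poll the collection's `num_documents` (returned by a GET on `/collections/<name>`) until it stops changing — a sketch, with `fetch_count` standing in for that API call:

```python
import time
from typing import Callable


def wait_for_stable_count(
    fetch_count: Callable[[], int],    # stand-in: GET /collections/<name> -> num_documents
    settle_checks: int = 3,
    interval: float = 2.0,
    sleep: Callable[[float], None] = time.sleep,
) -> int:
    """Poll until the document count is unchanged for `settle_checks`
    consecutive reads, i.e. the write queue has likely drained."""
    last = fetch_count()
    stable = 0
    while stable < settle_checks:
        sleep(interval)
        current = fetch_count()
        stable = stable + 1 if current == last else 0
        last = current
    return last
```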
Tomas
07:27 PM
Thank you! I will send it to you by DM
Jason
07:58 PM
Tomas I took a look at your cluster and it's indeed CPU exhaustion, as I mentioned above
Jason
07:58 PM
May I know if you’re using the batch import endpoint or the single document creation endpoint?
Tomas
09:45 PM
I imported many chunks of 2k records
May 25, 2023 (6 months ago)
Jason
01:52 AM
Tomas I would recommend either slowing down the volume of writes when Typesense returns a 503, or upgrading to at least 4vCPU to increase the volume of parallel writes that can be supported
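A simple way to slow the writes down on the client side is to send the batches sequentially with a pause between them. A sketch of that pacing (the 500-record batch size and 1-second pause are illustrative starting points, not Typesense-prescribed values, and `import_batch` stands in for the actual import call):

```python
import time
from typing import Callable, Iterator, List, Sequence


def chunked(records: Sequence[dict], size: int) -> Iterator[List[dict]]:
    """Split records into fixed-size batches for the import endpoint."""
    for i in range(0, len(records), size):
        yield list(records[i:i + size])


def paced_import(
    records: Sequence[dict],
    import_batch: Callable[[List[dict]], None],  # stand-in for the real import call
    batch_size: int = 500,
    pause: float = 1.0,
    sleep: Callable[[float], None] = time.sleep,
) -> int:
    """Send batches one at a time, pausing between them so a small
    cluster has room to drain its write queue."""
    sent = 0
    for batch in chunked(records, batch_size):
        import_batch(batch)
        sent += len(batch)
        sleep(pause)  # let the server index before the next batch arrives
    return sent
```

Combining this pacing with a backoff-on-503 retry inside `import_batch` covers both halves of the recommendation.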


Tomas
04:44 PM
Thank you Jason!



Similar Threads

Troubleshooting Indexing Duration in Typesense Import

Alan asked about lengthy indexing times for importing documents to Typesense. Jason suggested various potential causes, including network connectivity and system resources. They later identified the problem to be an error in Alan's code.


Revisiting Typesense for Efficient DB Indexing and Querying

kopach experienced slow indexing and crashes with Typesense. The community suggested to use batch import and check the server's resources. Improvements were made but additional support was needed for special characters and multi-search queries.


Optimizing Typesense Implementation for Large Collections

Oskar faced performance issues with his document collection in Typesense due to filter additions. Jason suggested trying a newer Typesense build and potentially partitioning the data into country-wise collections. They also discussed reducing network latency with CDN solutions.


Troubleshooting Typesense Document Import Error

Christopher had trouble importing 2.1M documents into Typesense due to memory errors. Jason clarified the system requirements, explaining the correlation between RAM and dataset size, and ways to tackle the issue. They both also discussed database-like query options.


Improving Record Retrieval Speed from Typesense

Yoshi sought ways to accelerate Typesense record retrieval. Jason advised upgrading to high availability and using the documents/export endpoint. They also noted a high volume of writes consuming significant CPU capacity as a possible performance factor.
