#community-help

Geo Load-Balanced Endpoint Causing Slow Performance

TLDR Edward reported slow responses in their geo load-balanced setup. Kishore Nallan suggested eliminating search_time_ms from the test, which revealed that the issue was not with the LB endpoint. The team will work on improving query latency with facets.

Solved
Mar 20, 2023 (6 months ago)
Edward
12:16 PM
Having an issue with the Geo load-balanced endpoint in that responses are often very slow. We have had issues in the past with the load-balanced endpoint, but we recently upgraded to the SDN to support our global offices and it seems it's not working correctly.

The first screenshot is with the ‘nearest_node’ option and the second just reverts to using the 3 separate node endpoints. There’s a clear performance difference. I suspect this isn’t correct?

The third screenshot may be related, in that sometimes the requests fail, but it seems to only fail on the load-balanced endpoint.
[Image 1: response times with the ‘nearest_node’ option]
[Image 2: response times using the 3 separate node endpoints]
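
For reference, the two setups being compared here correspond roughly to the following typesense-js client configurations. This is only a sketch: the host names, API key, and timeout values are placeholders, not the actual cluster details.

```typescript
import { Client } from 'typesense';

// Setup A: SDN / geo load-balanced access via nearestNode
// (the 'nearest_node' option mentioned above). Requests go to the
// nearest-node hostname first, with the individual nodes as fallback.
const lbClient = new Client({
  nearestNode: { host: 'xyz.a1.typesense.net', port: 443, protocol: 'https' }, // placeholder LB hostname
  nodes: [
    { host: 'xyz-1.a1.typesense.net', port: 443, protocol: 'https' },
    { host: 'xyz-2.a1.typesense.net', port: 443, protocol: 'https' },
    { host: 'xyz-3.a1.typesense.net', port: 443, protocol: 'https' },
  ],
  apiKey: '<search-only-api-key>',
  connectionTimeoutSeconds: 2,
});

// Setup B: the 3 separate node endpoints only, no nearestNode.
const directClient = new Client({
  nodes: [
    { host: 'xyz-1.a1.typesense.net', port: 443, protocol: 'https' },
    { host: 'xyz-2.a1.typesense.net', port: 443, protocol: 'https' },
    { host: 'xyz-3.a1.typesense.net', port: 443, protocol: 'https' },
  ],
  apiKey: '<search-only-api-key>',
  connectionTimeoutSeconds: 2,
});
```
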
Kishore Nallan
12:19 PM
👋 Are you using any proxy, or do your requests get routed via some firewall?

Edward
12:21 PM
Yes, we do have Cloudflare installed. Could that be it?
Kishore Nallan
12:22 PM
Yes, it's possible that somehow the outgoing IP is not getting matched with the nearest geo node, or that the outgoing IP is actually somewhere farther away than you are expecting it to be.
Kishore Nallan
12:23 PM
Try using a machine that is not connected to your corporate network and make the same requests.
Edward
12:25 PM
There’s no corporate network as such, but all requests go through Cloudflare, which acts as a WAF for the application.
Kishore Nallan
12:26 PM
Yes, that will proxy your requests
Kishore Nallan
12:27 PM
But wait, this is for the application only, right? Are those XHR requests hitting Typesense directly?
Edward
12:28 PM
Yes, and we were just discussing that we were experiencing this in local development too.
Kishore Nallan
12:28 PM
Can you also verify the search_time_ms value in the responses just to be sure?
Kishore Nallan
12:29 PM
The total time shown by the browser is the sum of search_time_ms and the actual network latency.
Kishore Nallan
12:30 PM
To truly test this, pick a really fast query that executes within a few milliseconds as per search_time_ms, and then try running it via both the LB and non-LB endpoints.
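
As a rough sketch of this test (reusing the placeholder clients from the earlier configuration sketch; the collection name 'products', field 'title', and the query 'shoes' are illustrative assumptions), subtracting search_time_ms from the wall-clock time isolates the network latency:

```typescript
import { Client } from 'typesense';

// Wall-clock time ≈ search_time_ms (server-side work) + network latency,
// so subtracting search_time_ms leaves the network portion.
async function timedSearch(client: Client, q: string) {
  const start = performance.now();
  const res = await client.collections('products').documents().search({
    q,                  // the query under test
    query_by: 'title',  // placeholder field name
  });
  const totalMs = Math.round(performance.now() - start);
  const searchTimeMs = res.search_time_ms ?? 0;
  return { totalMs, searchTimeMs, networkMs: totalMs - searchTimeMs };
}

// Run the same fast query through both clients from the earlier sketch
// and compare networkMs between the LB and non-LB configurations.
console.log('nearest_node (LB):', await timedSearch(lbClient, 'shoes'));
console.log('direct nodes     :', await timedSearch(directClient, 'shoes'));
```
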
Edward
12:31 PM
We’ll check now. The search_time_ms on the non-load-balanced configuration comes in about 100ms under the total time.
Toomas
12:34 PM
Initial load of non-query results with nearest_node enabled: 1.82s network call - search_time_ms: 1691 - This was tested in a local dev environment
Edward
12:35 PM
This was run locally too
Kishore Nallan
12:36 PM
Can we totally eliminate search_time_ms from the equation? Pick a query that produces no results, e.g. a keyword like afkjfkjdskjfsd that won't exist.
Kishore Nallan
12:37 PM
Or, if the endpoint and the app are already public, DM me the cluster ID and I can debug this for you.
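
For completeness, the same elimination test can also be run against the raw HTTP search endpoint of each hostname. Everything below (hostnames, collection, field, API key) is a placeholder; with a keyword that matches nothing, search_time_ms stays near zero and the measured time is essentially pure network latency.

```typescript
// Placeholder hostnames: the SDN/LB hostname vs. one individual node.
const endpoints: Record<string, string> = {
  'nearest_node (LB)': 'https://xyz.a1.typesense.net',
  'direct node 1':     'https://xyz-1.a1.typesense.net',
};

for (const [label, base] of Object.entries(endpoints)) {
  const start = performance.now();
  const res = await fetch(
    `${base}/collections/products/documents/search?q=afkjfkjdskjfsd&query_by=title`,
    { headers: { 'X-TYPESENSE-API-KEY': '<search-only-api-key>' } },
  );
  const body = await res.json();
  const totalMs = Math.round(performance.now() - start);
  // With a no-result keyword, totalMs ≈ network latency for this endpoint.
  console.log(`${label}: total=${totalMs}ms search_time_ms=${body.search_time_ms}`);
}
```
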
Edward
12:40 PM
We ran that search and it was pretty much the same across both configurations - 8/9 ms
Kishore Nallan
12:42 PM
Then I don't think the LB endpoint is the issue, because if you remove the actual query latency, everything else is network latency.
Edward
12:44 PM
There is still a significant difference between them which is odd.
Kishore Nallan
12:44 PM
With the normal queries?
Edward
12:45 PM
Our search is geared much more toward facets and filters than the actual search bar too. Not sure if that would change things?
Edward
12:45 PM
Yes, with normal queries and initial load
Kishore Nallan
12:46 PM
I don't think the LB configuration can be the culprit if the issue isn't reproducible with simple queries. The good news regarding general query latency with facets is that we are actively working to fix that, so you should see drastic improvements in a few weeks.
Edward
12:47 PM
oooh. That’s good to hear.