#community-help

Resolving Out of Memory Issue with Cluster

TLDR Ed encountered an OUT_OF_MEMORY error while importing documents into Typesense. Jason suggested increasing the cluster's RAM to account for the OS's overhead, which resolved the issue.

Solved
Sep 16, 2023 (2 weeks ago)
Ed
08:57 PM
Hi team, does this mean my cluster ran out of memory?
"message": "Rejecting write: running out of resource type: OUT_OF_MEMORY"}
{'num_deleted': 0}
Traceback (most recent call last):
  File "/Users/edondurguti/learning/tke_jobboard/typesense_upload.py", line 170, in <module>
    upload_and_delete_typesense()
  File "/Users/edondurguti/learning/tke_jobboard/typesense_upload.py", line 151, in upload_and_delete_typesense
    upsert_docs, ids = import_and_upsert_documents(
  File "/Users/edondurguti/learning/tke_jobboard/typesense_upload.py", line 83, in import_and_upsert_documents
    upsert = client.collections[collection_name].documents.import_(
  File "/Users/edondurguti/learning/tke_jobboard/.venv/lib/python3.10/site-packages/typesense/documents.py", line 94, in import_
    api_response = self.api_call.post(self._endpoint_path('import'), documents, params, as_json=False)
  File "/Users/edondurguti/learning/tke_jobboard/.venv/lib/python3.10/site-packages/typesense/api_call.py", line 153, in post
    return self.make_request(session.post, endpoint, as_json,
  File "/Users/edondurguti/learning/tke_jobboard/.venv/lib/python3.10/site-packages/typesense/api_call.py", line 116, in make_request
    raise ApiCall.get_exception(r.status_code)(r.status_code, error_message)
typesense.exceptions.ObjectUnprocessable: [Errno 422] Rejecting write: running out of resource type: OUT_OF_MEMORY
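Note that even a successful (HTTP 200) bulk import can reject individual documents; Typesense returns one JSON result per line, with `"success": false` and an `"error"` field (like the `Rejecting write` message above) on failed lines. A minimal sketch for surfacing those per-document failures, assuming that response shape:

```python
import json

def failed_docs(import_response):
    """Parse a Typesense bulk-import response (one JSON result per line)
    and return the entries whose write was rejected.

    The response shape assumed here ({"success": ..., "error": ...} per
    line) matches Typesense's documented import output; `import_response`
    is the raw string the client returns with as_json=False.
    """
    results = [json.loads(line) for line in import_response.splitlines() if line]
    return [r for r in results if not r.get("success")]
```

Checking this list after each import makes a resource rejection visible immediately instead of leaving `{'num_deleted': 0}` as the only clue.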
Jason
08:58 PM
Yeah, you would have to upgrade your RAM, under Cluster Configuration > Modify
Ed
08:59 PM
hmmm
Ed
08:59 PM
i really don’t have much there actually
Ed
09:00 PM
these are my files on disk
[Image: files on disk]
Ed
09:01 PM
and I use this function to upsert/delete docs
def upload_and_delete_typesense():
    client = create_client()
    # new_collection_name = "amso_zh"
    for lang in fe_languages:
        new_collection_name = f"tke_{lang}"
        upsert_docs, ids = import_and_upsert_documents(
            client, f"algolia_jobs-{lang}.json", new_collection_name
        )
        deleted = delete_document(client, new_collection_name, ids)
        pprint(deleted)
Ed
09:02 PM
[Image]
Ed
09:06 PM
is there a way to see what’s consuming this much memory? I am running on 0.5GB, but with my files (the way I read this) I should have at least 2-3 times the amount the collections take on disk
Ed
09:21 PM
here’s my upsert function as well
def import_and_upsert_documents(client, json_file, collection_name):
    # Open and read the JSON data from the file
    with open(json_file) as f:
        json_data = json.load(f)

    # get all 'id' fields
    ids = [
        doc["data"]["idClient"] for doc in json_data
    ]  # we are doing this here since we already opened the file

    # Convert list of JSON objects to JSONL format
    jsonl_data = "\n".join(json.dumps(doc) for doc in json_data)

    # Import documents into the collection
    upsert = client.collections[collection_name].documents.import_(
        jsonl_data.encode("utf-8"), {"action": "upsert"}
    )
    return upsert, ids
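One thing worth noting about this function: it serializes the entire file into a single JSONL payload and sends it in one request, so the server has to absorb the whole indexing spike at once. A common mitigation on small clusters is to split the import into batches; a minimal sketch (the 100-document batch size is an illustrative choice, not a Typesense recommendation):

```python
import json

def jsonl_batches(json_data, batch_size=100):
    """Yield JSONL strings of at most batch_size documents each.

    Sending several small imports instead of one large one keeps the
    server's per-request indexing memory spike small, which matters on
    a 0.5GB cluster.
    """
    for i in range(0, len(json_data), batch_size):
        chunk = json_data[i:i + batch_size]
        yield "\n".join(json.dumps(doc) for doc in chunk)

# Each batch would then be sent separately, e.g.:
# for batch in jsonl_batches(json_data, batch_size=100):
#     client.collections[collection_name].documents.import_(
#         batch.encode("utf-8"), {"action": "upsert"}
#     )
```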
Jason
10:39 PM
We don't store per index memory consumption metrics... I see a momentary spike in memory, so it's likely that the data being indexed didn't fit in the remaining amount of RAM.
Ed
10:50 PM
yeah I was importing the jobs when I got that error, but my total data in Typesense could be no more than 50MB, so I’m wondering why I got this error
Jason
10:51 PM
The OS usually takes up about 250MB of RAM by itself… and a 50MB file could take as much as 150MB of RAM.
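Jason's figures give a useful back-of-envelope budget: OS overhead plus the indexed size of the data (roughly 2-3x its on-disk size) must fit in the cluster's RAM. A tiny sketch of that arithmetic, using the thread's rough numbers as assumptions rather than official limits:

```python
def fits_in_ram(cluster_ram_mb, data_on_disk_mb, os_overhead_mb=250, index_factor=3):
    """Rough check: does OS overhead plus indexed data fit in RAM?

    os_overhead_mb and index_factor are the approximate figures from
    this thread, not official Typesense numbers.
    """
    needed_mb = os_overhead_mb + data_on_disk_mb * index_factor
    return needed_mb <= cluster_ram_mb

# 50MB of data on a 0.5GB (512MB) cluster: 250 + 50*3 = 400MB needed,
# which leaves very little headroom for indexing spikes.
```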
Ed
10:57 PM
ah ok i didn’t account for the OS
Ed
10:57 PM
I upgraded to 2GB