# community-help
Hi guys, I'm just experimenting with the new 26.0 version, and I tried to index a JSON dataset of around 1.5 GB with 500k records. Following is the schema and code I'm using:
```python
schema = {
    "name": "demo",
    "fields": [
        {"name": "fdcId", "type": "string"},
        {"name": "topThree", "type": "string[]"},
        {"name": "dataType", "type": "string", "sort": True},
        {"name": "description", "type": "string", "token_separators": [",", "."]},
        {"name": "servingSize", "type": "string", "index": False},
        {"name": "servingSizeUnit", "type": "string", "index": False},
        {"name": "nutrient_tags", "type": "string[]", "index": True},
        {"name": "ingredient_tags", "type": "string[]", "index": True},
        {"name": "ingredients", "type": "string", "index": True},
        {"name": "nutrients", "type": "auto", "index": False},
        {"name": ".*", "type": "auto"},
        {
            "name": "embedding",
            "type": "float[]",
            "embed": {
                "from": [
                    "dataType",
                    "description",
                    "servingSize",
                    "servingSizeUnit",
                    "ingredients",
                    "ingredient_tags",
                    "nutrient_tags",
                ],
                "model_config": {"model_name": "ts/e5-small"},
            },
        },
    ],
}
```
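For reference, a minimal sketch of creating the collection with this schema via the standard typesense Python client; the host, port, and API key below are placeholder assumptions, not the actual config:

```python
import typesense

# Placeholder connection settings; substitute your own host, port, and API key.
client = typesense.Client({
    "nodes": [{"host": "localhost", "port": "8108", "protocol": "http"}],
    "api_key": "xyz",
    # A generous timeout helps with long-running bulk imports.
    "connection_timeout_seconds": 600,
})

client.collections.create(schema)
```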

```python
with open('output_branded.jsonl') as jsonl_file:
    client.collections['demo'].documents.import_(
        jsonl_file.read().encode('utf-8'),
        {'action': 'upsert', 'batch_size': 100},
    )
```
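One thing worth noting: `jsonl_file.read()` loads the whole 1.5 GB file into memory and sends it as a single request, which can stall both the client and the server. A minimal sketch of importing in smaller chunks instead, assuming the same client as above (the 10,000-line chunk size is an arbitrary starting point to tune):

```python
import json

CHUNK_SIZE = 10_000  # arbitrary; tune to document size and server memory

def flush(batch):
    # import_ with a list of dicts returns one result dict per document
    results = client.collections['demo'].documents.import_(
        batch, {'action': 'upsert'})
    for result in results:
        if not result.get('success'):
            print(result)  # surface per-document failures instead of dropping them

with open('output_branded.jsonl') as jsonl_file:
    batch = []
    for line in jsonl_file:
        if line.strip():
            batch.append(json.loads(line))
        if len(batch) >= CHUNK_SIZE:
            flush(batch)
            batch = []
    if batch:  # flush the remainder
        flush(batch)
```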
After some time the server throws an error and becomes unresponsive. I have also attached the logs.