Darya
01/02/2025, 12:59 PM
conv_model = {
"id": "conv-model-1",
"model_name": "openai/gpt-4-turbo",
"history_collection": "conversation_store",
"api_key": "mykey",
"system_prompt": "You are an assistant for question-answering of a food delivery service. You can only make conversations based on the provided context.",
"max_bytes": 16384
}
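For context, this is roughly how I registered the model before searching (a minimal sketch, assuming the conversations_models.create endpoint of typesense-python; the connection details are placeholders):

import typesense

# Client pointing at the Typesense cluster (host/port/key are placeholders)
client = typesense.Client({
    'nodes': [{'host': 'localhost', 'port': '8108', 'protocol': 'http'}],
    'api_key': 'xyz',
    'connection_timeout_seconds': 10,
})

# Register the conversation model so searches can reference it by id
# via the conversation_model_id parameter
client.conversations_models.create(conv_model)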
I am using the Python API to get the results:
result = client.collections['menus'].documents.search({
'q': 'i am craving something sweet, which restaurants can i order from?',
'query_by': 'embedding',
'exclude_fields': 'embedding',
'conversation_model_id': "conv-model-1",
'conversation': True,
})
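With conversation enabled, the generated reply and the raw hits live in different parts of the response, so I inspect them separately (my reading of the documented response shape; the exact keys below are an assumption on my part):

# The LLM-generated reply produced from the retrieved documents
print(result['conversation']['answer'])

# The underlying search hits that were retrieved as context
for hit in result['hits']:
    # 'name' is a hypothetical field on the menus documents
    print(hit['document'].get('name'))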
But no matter what I try, I always get the same answer: "It looks like I don't have specific restaurant data available right now. I recommend checking the app or website you're using for a list of local restaurants that offer sweet dishes or desserts. You can usually find a variety of options there!" However, result['hits'] returns the correct documents (a bunch of bakeries and candy stores).
Is it some misconfiguration on my end?
Jason Bosco
01/02/2025, 3:59 PM
Darya
01/02/2025, 10:12 PM
ts/paraphrase-multilingual-mpnet-base-v2
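(That model is wired into the collection as an auto-embedding field, roughly like this — a hypothetical sketch of the menus schema, where the source fields name and description are stand-ins I'm using for illustration:)

menus_schema = {
    'name': 'menus',
    'fields': [
        {'name': 'name', 'type': 'string'},         # hypothetical source field
        {'name': 'description', 'type': 'string'},  # hypothetical source field
        {
            # Auto-embedding field, queried via query_by: 'embedding'
            'name': 'embedding',
            'type': 'float[]',
            'embed': {
                'from': ['name', 'description'],
                'model_config': {'model_name': 'ts/paraphrase-multilingual-mpnet-base-v2'},
            },
        },
    ],
}
client.collections.create(menus_schema)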
Jason Bosco
01/06/2025, 6:17 PM
Darya
01/08/2025, 1:27 PMmax_bytes
and excluding more fields from the document. When the menus are too long I suppose the context gets cut off too early for the response to be able to use the documents properly
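Roughly what the fix looked like (a sketch assuming conversation models can be updated in place by id via typesense-python; 65536, description, and image_url are stand-ins for the actual values and field names I used):

# Give the model a larger context budget for the retrieved documents
client.conversations_models['conv-model-1'].update({'max_bytes': 65536})

# Keep bulky fields out of the context passed to the model;
# 'description' and 'image_url' are placeholder field names
result = client.collections['menus'].documents.search({
    'q': 'i am craving something sweet, which restaurants can i order from?',
    'query_by': 'embedding',
    'exclude_fields': 'embedding,description,image_url',
    'conversation_model_id': 'conv-model-1',
    'conversation': True,
})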