Integrating Chatbot with OpenAI and Using Vector Values

TLDR Denny asked how to use embedded data, specifically vector values, with OpenAI in a chatbot. Kishore Nallan advised feeding the top search results into the chatbot and asking it to summarise them.

Denny
Thu, 21 Sep 2023 05:34:50 UTC

Is there a way to somehow pull my embedded data and use it with the OpenAI chat completion API?

Kishore Nallan
Thu, 21 Sep 2023 10:03:28 UTC

Can you elaborate on what you mean by "embedded data"? Vector values?

Denny
Thu, 21 Sep 2023 10:03:40 UTC

Yes, vector values.

Kishore Nallan
Thu, 21 Sep 2023 10:05:00 UTC

Yeah, that will be returned as part of the search response.
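
For reference, a minimal sketch of such a search with the Typesense Python client, assuming a "products" collection with an auto-embedding field named "embedding" (the collection name, field names, and connection details are placeholders):

```python
import typesense

# Placeholder connection details.
client = typesense.Client({
    "nodes": [{"host": "localhost", "port": "8108", "protocol": "http"}],
    "api_key": "xyz",
    "connection_timeout_seconds": 5,
})

# Semantic search against the auto-embedding field: the query text is embedded
# server-side and the nearest documents come back in the search response.
results = client.collections["products"].documents.search({
    "q": "black TVs",
    "query_by": "embedding",        # assumed auto-embedding field name
    "exclude_fields": "embedding",  # keep the raw vectors out of the payload
    "per_page": 5,
})

# Each hit carries the matched document plus a vector_distance score.
for hit in results["hits"]:
    print(hit.get("vector_distance"), hit["document"].get("name"))
```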

Denny
Thu, 21 Sep 2023 10:05:22 UTC

But how do I integrate it into a chatbot?

Denny
Thu, 21 Sep 2023 10:05:41 UTC

Because right now, if I search “Hi, please show me black TVs”, it will use every word in that sentence to query.

Kishore Nallan
Thu, 21 Sep 2023 10:06:57 UTC

You have to get the top few responses, feed that text into the chatbot, and ask it to produce an answer from that.
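
A minimal sketch of that flow, building on the search call shown earlier; the document field, prompt wording, and model name are placeholders rather than a fixed recipe:

```python
from openai import OpenAI

openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment

def answer_from_search(question: str, search_results: dict) -> str:
    # Take the text of the top few hits and concatenate them as context.
    # "description" is an assumed document field.
    top_docs = [
        hit["document"].get("description", "")
        for hit in search_results["hits"][:3]
    ]
    context = "\n\n".join(top_docs)

    # Ask the LLM to answer using only the retrieved context.
    response = openai_client.chat.completions.create(
        model="gpt-3.5-turbo",  # any chat-completion model works here
        messages=[
            {"role": "system",
             "content": "Answer the user's question using only the context below.\n\n"
                        f"Context:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(answer_from_search("Hi, please show me black TVs", results))
```

The system prompt restricts the model to the retrieved context, which is the summarisation step described below.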

Denny
Thu, 21 Sep 2023 10:07:12 UTC

Using fine-tuning?

Kishore Nallan
Thu, 21 Sep 2023 10:07:51 UTC

No need. You are basically asking the LLM to summarise the top results for you, for example for question answering.