Kasen Stephensen
07/21/2025, 5:12 PM
prefix: false), but I noticed that grouping documents (while technically allowed) breaks the AI conversation response: it looks like the grouped hits are not passed to the model as context. Here's my code:
const searchParams: any = {
  q: query,
  query_by: DEFAULT_QUERY_BY,
  exclude_fields: EXCLUDE_FIELDS,
  conversation_model_id: CONVERSATION_MODEL_ID,
  conversation: true,
  per_page: DEFAULT_PER_PAGE,
  prefix: CONVERSATION_PREFIX,
  // group_by: 'slug',
  // group_limit: 3, // we're able to add these and the database query works, but the LLM response is inadequate
  filter_by: 'status:active'
}

// Add conversation ID for follow-up questions
if (conversationId) {
  searchParams.conversation_id = conversationId
}

const response = await typesense
  .collections(OPPORTUNITIES_COLLECTION)
  .documents()
  .search(searchParams)
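If group_by is what's keeping the hits out of the conversation context, one possible workaround (just a sketch, assuming the same constants and client setup as above, and that each document has a slug field) is to run the conversational search without group_by so the model still sees the raw hits, and then cap the hits per slug on the client side for display:

// Sketch of a possible workaround: no group_by in the conversational search,
// dedupe by slug afterwards. MAX_HITS_PER_SLUG is a made-up limit that mirrors
// the group_limit we wanted to use.
const MAX_HITS_PER_SLUG = 3

const ungroupedResponse: any = await typesense
  .collections(OPPORTUNITIES_COLLECTION)
  .documents()
  .search({
    q: query,
    query_by: DEFAULT_QUERY_BY,
    exclude_fields: EXCLUDE_FIELDS,
    conversation_model_id: CONVERSATION_MODEL_ID,
    conversation: true,
    per_page: DEFAULT_PER_PAGE,
    prefix: CONVERSATION_PREFIX,
    filter_by: 'status:active'
  })

// Keep at most MAX_HITS_PER_SLUG hits per slug for the UI; the model has
// already received the ungrouped hits as context.
const perSlugCount = new Map<string, number>()
const dedupedHits = (ungroupedResponse.hits ?? []).filter((hit: any) => {
  const slug = hit.document.slug
  const count = perSlugCount.get(slug) ?? 0
  perSlugCount.set(slug, count + 1)
  return count < MAX_HITS_PER_SLUG
})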
Fanis Tharropoulos
07/21/2025, 5:14 PM
Kasen Stephensen
07/21/2025, 5:22 PM
// Typesense Conversation Model Configuration
// Query Parameters
export const DEFAULT_QUERY_BY = 'embedding' // Auto-embedding field for semantic search
export const EXCLUDE_FIELDS = 'embedding' // Fields to exclude from search results
export const CONVERSATION_PREFIX = false // Disable prefix matching for conversations
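For completeness, a rough sketch of the follow-up flow these constants are meant to support. It assumes the first response exposes the generated ID as response.conversation.conversation_id (worth double-checking against your typesense-js version), and the example queries are made up:

// First turn: no conversation_id yet.
const firstResponse: any = await typesense
  .collections(OPPORTUNITIES_COLLECTION)
  .documents()
  .search({
    q: 'remote data engineering roles', // hypothetical example query
    query_by: DEFAULT_QUERY_BY,
    exclude_fields: EXCLUDE_FIELDS,
    conversation_model_id: CONVERSATION_MODEL_ID,
    conversation: true,
    per_page: DEFAULT_PER_PAGE,
    prefix: CONVERSATION_PREFIX,
    filter_by: 'status:active'
  })

// Assumed response shape: a `conversation` object carrying the generated ID.
const conversationId = firstResponse.conversation?.conversation_id

// Follow-up turn: same model, plus the returned conversation_id so the model
// can resolve references like "those" against the earlier turn.
const followUpResponse = await typesense
  .collections(OPPORTUNITIES_COLLECTION)
  .documents()
  .search({
    q: 'which of those are open to new grads?', // hypothetical follow-up
    query_by: DEFAULT_QUERY_BY,
    exclude_fields: EXCLUDE_FIELDS,
    conversation_model_id: CONVERSATION_MODEL_ID,
    conversation: true,
    conversation_id: conversationId,
    per_page: DEFAULT_PER_PAGE,
    prefix: CONVERSATION_PREFIX,
    filter_by: 'status:active'
  })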
Fanis Tharropoulos
07/21/2025, 5:23 PM