# community-help
Is LiteLLM OpenAI-compatible? From a quick read, I saw that it standardizes the API format, but is it hosted on a remote URL like we describe in the docs? Apart from that, if you want to use your own model, you can also pass in the embeddings you generate from that model yourself. Lastly, could you describe which part of the caching documentation is not up to par? We strive for a fully documented API, and feedback is always welcome.
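For context on the OpenAI-compatibility question: in practice it usually means you can point the standard OpenAI client at a LiteLLM proxy URL instead of api.openai.com. Here's a minimal sketch, assuming a LiteLLM proxy is already running locally on port 4000 (the URL, key, and model name below are placeholders, not anything from this thread):

```python
# Sketch: calling a LiteLLM proxy through the standard OpenAI Python client.
# Assumes a proxy is already running at http://localhost:4000 and is
# configured to route the model named below -- both are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # LiteLLM proxy endpoint instead of api.openai.com
    api_key="sk-anything",             # key handling is up to the proxy's config
)

# Same request shape as a direct OpenAI call; the proxy translates it
# to whichever backend model it is configured for.
response = client.embeddings.create(
    model="text-embedding-3-small",
    input="hello world",
)
print(len(response.data[0].embedding))
```

If you're generating embeddings with your own model instead, you'd skip the call above and pass the resulting vectors in directly, as mentioned.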