I've had some thoughts about privacy lately. Running a fully local LLM complicates things and makes quality difficult to maintain, yet people want local solutions for privacy. Azure OpenAI can offer extra security/privacy. Maybe offering the ability to bring your own token … including Azure OpenAI as well as vector database options … would let people use this on more sensitive data. Possibly the option of switching between providers per workspace? Even switching to GPT-4 for some questions as well.
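The per-workspace switching idea could look something like the sketch below. All names here (`WorkspaceConfig`, `config_for`, the provider strings) are hypothetical illustrations of the suggestion, not any existing API:

```python
# Hypothetical sketch: per-workspace provider/model configuration with
# bring-your-own-token support. Every name below is illustrative.
from dataclasses import dataclass

@dataclass
class WorkspaceConfig:
    workspace: str
    provider: str   # e.g. "openai", "azure-openai", "self-hosted"
    model: str      # e.g. "gpt-4"
    api_token: str  # user-supplied key ("bring your own token")

# Fallback used by any workspace without its own settings.
DEFAULT = WorkspaceConfig("default", "openai", "gpt-3.5-turbo", "<your-token>")

# A sensitive workspace routed to Azure OpenAI with its own key.
configs = {
    "sensitive-docs": WorkspaceConfig(
        "sensitive-docs", "azure-openai", "gpt-4", "<azure-key>"
    ),
}

def config_for(workspace: str) -> WorkspaceConfig:
    """Return the workspace's own config, falling back to the default."""
    return configs.get(workspace, DEFAULT)
```

So a "sensitive-docs" workspace would resolve to Azure OpenAI + GPT-4, while everything else keeps the default provider.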
We are actually working on incorporating additional LLM models into the chat so users can choose which one answers their questions. This is currently in development, but we will let you know as soon as it's released.
The idea is to offer additional commercial models such as BERT, Claude, GPT-4, etc., as well as to experiment with some self-hosted ones for greater privacy.
Will keep you posted