There’s also an option to bring your own LLM: enabling the manual option exposes fields for model name, endpoint, and API token. The page itself warns, though, that local models may not work correctly.
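For context, here’s a sketch of how those three fields typically map onto a request, assuming the feature speaks the OpenAI-compatible chat API that most local servers (Ollama, llama.cpp, vLLM) expose. Everything here is illustrative — the field names, the endpoint, and the path are assumptions, not confirmed behavior:

```python
# Hypothetical mapping of the manual "bring your own LLM" fields onto
# an OpenAI-compatible chat-completions request. Local servers such as
# Ollama, llama.cpp, and vLLM all expose this API shape, but nothing
# here is confirmed about how the browser actually wires it up.

def build_chat_request(model: str, endpoint: str, api_token: str, prompt: str) -> dict:
    """Assemble an OpenAI-style chat-completions request for a local model."""
    return {
        "url": f"{endpoint.rstrip('/')}/v1/chat/completions",
        "headers": {"Authorization": f"Bearer {api_token}"},
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

req = build_chat_request(
    model="llama3",                     # the model-name field
    endpoint="http://localhost:11434",  # the endpoint field (Ollama's default port)
    api_token="not-needed-locally",     # the token field; local servers often ignore it
    prompt="Summarize this page.",
)
print(req["url"])
```

If a local model “doesn’t work correctly,” the mismatch is often right here: the browser may expect a hosted provider’s exact response format, and a local server’s slightly different output breaks the integration.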
It looks like there’s a self-hosting option too, so you won’t have to send your history to someone else’s computer.
If it’s anything like how they handled the AI sidebar, this option will get hidden before it hits production.
It would be really cool if they didn’t do that this time.