About
Building a **local** LLM & Agent platform.

* Run LLMs and agents without the pain of having to set up everything around them.
* Single-app deployment that works out of the box (no need to install external deps [Docker, Python, etc.]). IT JUST WORKS
* Multiple LLM vendors
* Out-of-the-box local tools. For example:
  * Small / large model access
  * Screenshots as context
  * Local embeddings
  * Local message history
* On-demand privacy
  * Use a local model when possible, otherwise use a remote LLM
  * When a remote LLM is used, obfuscate private info; if that is not possible, inform the user
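The on-demand privacy routing above could be sketched roughly as follows. This is an illustrative sketch only, not the platform's actual API: the function names (`route`, `obfuscate`) and the regex-based PII detection are assumptions made for the example.

```python
import re

# Hypothetical PII detectors; a real implementation would use far more
# robust detection than these two illustrative regexes.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s-]{7,}\d"),
}

def obfuscate(text):
    """Replace detected private info with placeholder tokens.

    Returns the masked text plus a mapping so the caller can restore
    the original values in the model's response (and can inform the
    user exactly what was masked, or warn when masking was impossible).
    """
    mapping = {}
    for label, pattern in PII_PATTERNS.items():
        for i, match in enumerate(pattern.findall(text)):
            token = f"<{label}_{i}>"
            mapping[token] = match
            text = text.replace(match, token)
    return text, mapping

def route(prompt, local_available):
    """Local-first routing: prefer the local model; only when falling
    back to a remote LLM, obfuscate private info before it leaves the
    machine."""
    if local_available:
        # Nothing leaves the machine, so no obfuscation is needed.
        return "local", prompt, {}
    safe_prompt, mapping = obfuscate(prompt)
    return "remote", safe_prompt, mapping
```

A caller would send `safe_prompt` to the remote vendor and use `mapping` to tell the user which values were masked, restoring them locally in the reply.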