Note: The current version is not an MVP yet.

- API keys for OpenAI or Anthropic need to be set in the Settings page.
- macOS: the app is not notarized yet. To install and launch it, you need to allow it from OS Settings > Privacy & Security.
- Local models do not work out of the box for now. A model in GGUF format needs to be located on the computer at: `~/.cache/lm-studio/models/lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf`. That will be fixed in an upcoming version.
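Until local model discovery is implemented, one way to satisfy the hard-coded path above is to create the expected directory and place the GGUF file there by hand (a sketch of the workaround, not an official install step):

```shell
# Create the directory the app currently expects (hard-coded path, see note above)
mkdir -p ~/.cache/lm-studio/models/lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF
# Then copy or download Meta-Llama-3-8B-Instruct-Q4_K_M.gguf into that directory,
# e.g. from LM Studio or your own model store.
```

If you already use LM Studio and have downloaded this model through it, the file may be at this path already.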
## About
Building a **local** LLM & Agent platform.

* Run LLMs & Agents without the pain of setting up everything around them.
* Single-app deployment that works out of the box (no need to install external deps [Docker, Python, etc.]). IT JUST WORKS
* Multi-vendor LLM support
* Out-of-the-box local tools. For example:
  * Small / large model access
  * Screenshot as context
  * Local embedding
  * Local message history
* On-demand privacy
  * Use a local model when possible; otherwise use a remote LLM
  * When a remote LLM is used, obfuscate private info; if that is not possible, inform the user
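The privacy routing described in the last two bullets could look roughly like the sketch below. All names here (`route_prompt`, `obfuscate`, the email-only masking) are illustrative assumptions, not the project's actual API; a real implementation would detect more kinds of private info than email addresses:

```python
import re

# Hypothetical example: mask only email addresses before a remote call.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def obfuscate(text: str) -> str:
    """Replace private info (here: email addresses) with placeholders."""
    return EMAIL.sub("[email]", text)

def route_prompt(text: str, local_available: bool) -> tuple[str, str]:
    """Prefer the local model; fall back to a remote LLM with obfuscation."""
    if local_available:
        # Local model: the prompt never leaves the machine, send it as-is.
        return ("local", text)
    masked = obfuscate(text)
    if masked != text:
        # Private info was masked; the UI should inform the user of this.
        return ("remote-obfuscated", masked)
    return ("remote", text)
```

For example, `route_prompt("mail me at a@b.com", local_available=False)` would send `"mail me at [email]"` to the remote vendor, while the same prompt with a local model available goes out untouched.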