
๐Ÿง‘โ€๐Ÿ”ง Personal AI Assistant

๐Ÿง‘โ€๐Ÿ”ง Personal AI Assistant

A personal AI assistant


Try it here: https://github.com/samuelint/ai-assistant/releases

Note: The current version is not an MVP yet.

* API keys for OpenAI or Anthropic need to be set in the Settings page.
* macOS: the app is not notarized yet. To install and launch it, you need to allow it from the OS Settings menu > Privacy & Security.
* Local models do not work out of the box for now. A model in GGUF format needs to be located on the computer at `~/.cache/lm-studio/models/lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf` (see the check after this list). This will be fixed in an upcoming version.
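Until that is fixed, a quick way to verify the local-model prerequisite is a short check like the one below. This is a minimal sketch, not part of the app; only the file path comes from the note above.

```python
from pathlib import Path

# Expected location of the local model, taken from the note above.
MODEL_PATH = (
    Path.home()
    / ".cache/lm-studio/models/lmstudio-community"
    / "Meta-Llama-3-8B-Instruct-GGUF/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf"
)

if MODEL_PATH.is_file():
    size_gb = MODEL_PATH.stat().st_size / 1e9
    print(f"Local model found ({size_gb:.1f} GB): {MODEL_PATH}")
else:
    print("Local model missing. Place the GGUF file at:")
    print(f"  {MODEL_PATH}")
```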

About

Building a **local** LLM & Agent platform.

* Run LLMs & agents without the pain of setting up everything around them.
* Single-app deployment that works out of the box (no need to install external dependencies such as Docker or Python). IT JUST WORKS.
* Multiple LLM vendors.
* Out-of-the-box local tools, for example:
  * Small / large model access
  * Screenshot as context
  * Local embedding
  * Local message history
* On-demand privacy (a rough sketch follows this list):
  * Use a local model when possible, otherwise use a remote LLM.
  * When a remote LLM is used, obfuscate private information; if that is not possible, inform the user.
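The privacy routing described above could look roughly like this. It is a minimal sketch in Python: `run_local_model`, `run_remote_llm`, and the e-mail masking are illustrative placeholders, not the project's actual API.

```python
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def obfuscate(prompt: str) -> str:
    """Mask obviously private tokens (here: e-mail addresses) before a remote call."""
    return EMAIL_RE.sub("[REDACTED_EMAIL]", prompt)

def run_local_model(prompt: str) -> str:
    # Placeholder for the local GGUF model call (e.g. Meta-Llama-3-8B-Instruct).
    return f"[local answer to] {prompt}"

def run_remote_llm(prompt: str) -> str:
    # Placeholder for a remote vendor call (OpenAI or Anthropic, using the user's API key).
    return f"[remote answer to] {prompt}"

def answer(prompt: str, local_available: bool) -> str:
    if local_available:
        return run_local_model(prompt)  # data never leaves the machine
    safe_prompt = obfuscate(prompt)
    if safe_prompt != prompt:
        # "Inform the user" step: something was redacted before going remote.
        print("Note: private details were obfuscated before the remote call.")
    return run_remote_llm(safe_prompt)

if __name__ == "__main__":
    print(answer("Summarize the email from jane.doe@example.com", local_available=False))
```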

Builders

Nicolas Carrier

No bio yet

Samuel Magny

Fun, passion, Father, Software Engineer, Amateur of Italian food & from Quebec!