Privacy by default
Scripts and prompts stay on your machine. No data leaves the device unless a step explicitly calls out to an external service.
Build automation chains that run on your machine. Add LLM steps that work offline. No cloud, no leaks.
Why local-first
Open formats, swappable runtimes, BYO LLM keys. Take your chains anywhere — no proprietary cloud to escape.
Hotkeys, schedules, plugins — every layer is yours to configure, fork, or replace.
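The "open formats" claim boils down to chains being plain data you can read, version, and carry between runtimes. A minimal sketch of what that could look like, assuming a JSON chain definition — the field names (`trigger`, `steps`, `type`) are illustrative, not Sloddy's documented schema:

```python
import json

# Hypothetical chain definition -- field names are illustrative,
# not Sloddy's documented schema. The point is that the chain is
# plain, inspectable data with no proprietary container around it.
chain = {
    "name": "summarize-downloads",
    "trigger": {"type": "hotkey", "keys": "ctrl+alt+s"},
    "steps": [
        {"type": "shell", "run": "ls ~/Downloads"},
        {"type": "prompt", "model": "local", "template": "Summarize: {input}"},
    ],
}

# Serialize to disk and back: the chain round-trips losslessly,
# so any runtime that speaks JSON can load it.
text = json.dumps(chain, indent=2)
restored = json.loads(text)
print(restored["steps"][1]["type"])  # prints "prompt"
```

Because the definition is ordinary JSON, swapping runtimes means pointing a different loader at the same file, not exporting from a walled garden.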
Coming soon
Drop a prompt step into any chain. Bring your own OpenAI / Anthropic / Mistral key — never ours.
Describe the workflow in a sentence. Sloddy scaffolds the nodes; you tweak and run.
Plug in a local Ollama runtime. Full LLM-powered automation, zero outbound traffic.
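Under the hood, "zero outbound traffic" means every LLM call targets localhost. A minimal sketch of such a step using Ollama's standard HTTP API (`POST /api/generate` on port 11434); the `build_request` helper and the model name are illustrative:

```python
import json
import urllib.request

# Ollama's default local endpoint -- requests never leave the machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a request for the local Ollama API (illustrative helper).

    stream=False asks Ollama for a single JSON response instead of
    a streamed sequence of chunks.
    """
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("llama3", "Rename these files to kebab-case.")
print(req.full_url)  # prints "http://localhost:11434/api/generate"
```

Sending the request with `urllib.request.urlopen(req)` (with Ollama running) returns the completion from the local model; nothing in the round trip touches an external host.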