v0.1.1 · File & Photo Organizer agents shipped
Local AI agents that quietly run your computer for you.
DeskPilot gives you a crew of tiny AI agents that live on your computer. They organize your files, sort your photos, and soon they'll summarize documents, rename screenshots, and more. Everything runs 100% locally via Ollama. No subscriptions. No cloud.
Requires Ollama and a local model (llama3.2:3b). DeskPilot can help you start and stop Ollama from inside the app.
Cleans up messy folders. Local AI proposes a folder structure. You review, approve, and can undo everything.
Sorts screenshots, work assets, and personal photos into neat folders. All decisions made by a local model.
- Screenshot Renamer
- Downloads Cleaner
- Document Summarizer
- Powered by Ollama
- No cloud calls
- Undo safety built in
How DeskPilot works
DeskPilot is a small desktop app built with Tauri. It talks to Ollama running on your machine, and each agent is a mix of local file operations and a small language model deciding what to do. No data ever leaves your computer.
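As a rough sketch of that local round trip: an agent's request to Ollama is just a POST to the API on localhost (by default `http://localhost:11434/api/generate`) with a small JSON body. The helper below is hypothetical, not DeskPilot's actual request code, and does only minimal JSON escaping:

```rust
// Hypothetical sketch: build the JSON body an agent might POST to
// Ollama's local /api/generate endpoint. Real code would use a JSON
// library and proper escaping; this only escapes double quotes.
fn ollama_request_body(model: &str, prompt: &str) -> String {
    // "stream": false asks Ollama for one complete JSON response
    // instead of a stream of chunks.
    format!(
        "{{\"model\":\"{}\",\"prompt\":\"{}\",\"stream\":false}}",
        model,
        prompt.replace('"', "\\\"")
    )
}

fn main() {
    let body = ollama_request_body(
        "llama3.2:3b",
        "Propose a folder structure for: invoice.pdf, IMG_0412.jpg",
    );
    println!("{body}");
}
```

Because the model listens only on localhost, the prompt and the file names it contains never leave the machine.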
Pick an agent
File Organizer, Photo Organizer, and soon many more. Each agent focuses on a small, useful task on your computer.
Local AI proposes a plan
The agent scans your files (locally) and asks a local LLM to propose what to do. You see a full preview before anything changes.
Approve, run, undo
Approve the plan to run it. If you do not like the outcome, you can undo everything in one click. Privacy and safety by default.
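The preview → approve → undo loop can be sketched as a plan of reversible file moves. This is a simplified illustration with hypothetical types, not DeskPilot's actual data structures:

```rust
use std::path::PathBuf;

// A single proposed file move.
#[derive(Clone, Debug, PartialEq)]
struct Move {
    from: PathBuf,
    to: PathBuf,
}

// A plan is just an ordered list of moves, shown to the user
// in full before anything on disk changes.
struct Plan {
    moves: Vec<Move>,
}

impl Plan {
    // Preview: the human-readable list the user approves or rejects.
    fn preview(&self) -> Vec<String> {
        self.moves
            .iter()
            .map(|m| format!("{} -> {}", m.from.display(), m.to.display()))
            .collect()
    }

    // Undo: the same moves, reversed in both order and direction,
    // so applying them restores the original layout.
    fn undo_plan(&self) -> Vec<Move> {
        self.moves
            .iter()
            .rev()
            .map(|m| Move { from: m.to.clone(), to: m.from.clone() })
            .collect()
    }
}

fn main() {
    let plan = Plan {
        moves: vec![Move {
            from: PathBuf::from("Downloads/report.pdf"),
            to: PathBuf::from("Documents/Reports/report.pdf"),
        }],
    };
    for line in plan.preview() {
        println!("{line}");
    }
}
```

Because every action is recorded as a move, undo is deterministic: replay the recorded moves backwards.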
Why local AI instead of another cloud subscription?
Most AI tools want you to pay a monthly fee and upload your files to their servers. For organizing your Desktop or cleaning Downloads, that is overkill.
- No subscriptions: DeskPilot uses a free local model via Ollama. Once it is set up, there are no recurring costs.
- No cloud: All agents run on your machine. There are no API calls to any external service.
- No accounts: You do not need to sign up or create keys to start using it.
- Tauri (Rust + React) desktop app
- Tailwind + shadcn/ui for the interface
- Ollama running a local model (llama3.2:3b)
- Rust agents doing the actual file operations
- Preview → approve → undo for every action
Frequently asked questions
Does DeskPilot send my files to the cloud?
No. DeskPilot talks to Ollama on localhost and uses Rust code to read and move files. There are no calls to remote APIs or servers.
Do I need a paid AI subscription or API key?
No. The agents use a small local model (for example, llama3.2:3b) running in Ollama. That is free and runs entirely on your hardware.
What hardware do I need?
A modern macOS or Windows machine with at least 8 GB of RAM is recommended for a smooth experience with local models. Agents that handle large folders or many photos will be faster with more RAM and a recent CPU.
Can I see the source code?
The app is hosted on GitHub. You can inspect the code, open issues, and suggest agents you want to see next.