Frequently Asked Questions
Everything you need to know about Caro
Getting Started
What is Caro?
Caro is a privacy-first AI shell assistant that converts natural language into safe, validated shell commands. It runs 100% locally on your machine: your commands, file paths, and data never leave your computer.
How do I install Caro?
The easiest way is via cargo: cargo install caro. You can also use our install script: bash <(curl --proto '=https' --tlsv1.2 -sSfL https://setup.caro.sh). For other options, see our installation guide.
What are the system requirements?
Caro runs on macOS, Linux, and Windows (WSL). You need Rust 1.70+ for building from source. For local LLM inference, we recommend 8GB+ RAM. Apple Silicon Macs get hardware acceleration with MLX.
Do I need an API key to use Caro?
It depends on your backend choice. Local backends (Ollama, MLX, llama.cpp) require no API keys. Cloud backends (Anthropic, OpenAI) require API keys. Caro defaults to trying local inference first.
Safety & Security
How does Caro's safety validation work?
Caro uses pattern-based command validation, not AI-based filtering. Every generated command is checked against known dangerous patterns (rm -rf /, fork bombs, disk wipers) before you run it. This is deterministic, not probabilistic: the same command always gets the same safety assessment.
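The idea can be sketched in shell. Note this is an illustration of the deterministic matching approach, not Caro's actual rule set or implementation; the patterns below are examples only and are deliberately crude:

```shell
# Illustrative sketch of deterministic, pattern-based validation.
# These substrings are examples only; Caro's real pattern list is larger
# and more precise.
is_dangerous() {
  cmd="$1"
  for pattern in 'rm -rf /' ':(){ :|:& };:' 'mkfs.' 'of=/dev/sd'; do
    case "$cmd" in
      *"$pattern"*) return 0 ;;  # a known dangerous substring matched
    esac
  done
  return 1  # no pattern matched
}

is_dangerous "rm -rf /" && echo "blocked"
is_dangerous "ls -la" || echo "ok"
```

Because matching is plain string comparison rather than model inference, the same input always produces the same verdict.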
What happens when Caro detects a dangerous command?
Caro shows a clear warning explaining why the command is dangerous and what it would do. You can still proceed if you intend to run it; Caro is a seatbelt, not a straitjacket. For truly destructive commands, you'll need to explicitly confirm.
Does Caro send my data to the cloud?
No. Caro runs 100% locally by default. Your commands, file paths, server names, and directory structures never leave your machine. If you choose to use a cloud backend (like Anthropic or OpenAI), only the natural language prompt is sent, not your command history or file system data.
Is Caro safe to use with AI agents?
Yes. Caro is designed with AI agents in mind. We recommend defense in depth: run agents as unprivileged users, sandbox to specific directories, use container isolation, and let Caro validate commands. Each layer catches what others miss.
How does Caro stay updated with new dangerous patterns?
Safety patterns are baked into the binary. When you update Caro (cargo install caro --force), you get the latest patterns. The core dangerous commands don't change. We also accept pattern contributions via GitHub.
Backends & Models
What LLM backends does Caro support?
Caro supports multiple backends. Local: Ollama, MLX (Apple Silicon), and llama.cpp. Cloud: Anthropic Claude and OpenAI GPT. Caro automatically selects the best available backend, or you can specify one explicitly.
Which backend should I use?
For privacy and offline use, choose local backends. On Apple Silicon Macs, MLX offers the best performance with hardware acceleration. Ollama is great cross-platform. For best quality responses, cloud backends like Claude typically perform better but require API keys.
How do I set up Ollama with Caro?
Install Ollama from ollama.ai, then run ollama pull llama3 (or another model). Caro will automatically detect Ollama when it's running. You can verify with caro --backend ollama "list files".
How do I use MLX on my Mac?
MLX is built into Caro for Apple Silicon Macs. Ensure you have the mlx Python package installed. Caro will automatically use MLX when available on M1/M2/M3 Macs for hardware-accelerated local inference.
Can I use my own fine-tuned models?
Yes! With Ollama, you can use any GGUF model. With MLX, you can use any MLX-compatible model. Point Caro to your model with the --model flag or set it in your config file.
Usage & Features
How do I use Caro?
Simply run caro "your request in natural language". For example: caro "find all Python files modified in the last week". Caro generates the command, shows it to you with a safety assessment, and asks for confirmation before running.
Can Caro execute commands automatically?
By default, Caro shows you the command and waits for confirmation. You can use --execute or -e to auto-execute safe commands, but dangerous commands always require explicit confirmation.
How does Caro handle different shells?
Caro detects your shell ($SHELL) at runtime and generates appropriate syntax. It works with bash, zsh, fish, and POSIX sh. On macOS, it knows you're using BSD tools; on Linux, it adjusts for GNU syntax.
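The detection described above can be sketched like this (an illustration of the approach, not Caro's code):

```shell
# Sketch: derive the shell name and userland flavor at runtime.
shell_name() {
  # $SHELL holds a path like /bin/zsh; keep only the last component.
  basename "${1:-$SHELL}"
}

toolchain_flavor() {
  # macOS ships BSD userland tools; Linux ships GNU coreutils.
  case "$(uname -s)" in
    Darwin) echo "BSD" ;;
    Linux)  echo "GNU" ;;
    *)      echo "unknown" ;;
  esac
}
```

The flavor matters because commands like sed and date take different flags on BSD and GNU systems.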
Can I use Caro in scripts?
Yes! Use caro --quiet for scripting. You can pipe output, use in CI/CD, or integrate with other tools. The --output json flag outputs structured data for programmatic use.
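A script might consume the JSON output like this. The caro invocation is shown commented so the sketch runs without Caro installed, and the field names in the stand-in payload are assumptions, since this FAQ doesn't document the schema:

```shell
# In a real script:  json=$(caro --quiet --output json "list files")
# Stand-in payload with an assumed schema (field names are guesses):
json='{"command": "ls -la", "safe": true}'

# Naive extraction of the "command" field; fine for a sketch,
# but use a real JSON parser (jq, python) in production scripts.
cmd=$(printf '%s' "$json" | sed -n 's/.*"command": *"\([^"]*\)".*/\1/p')
echo "$cmd"
```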
How do I configure Caro?
Caro uses a config file at ~/.config/caro/config.toml. You can set your preferred backend, default model, safety level, and other options. Run caro --help to see all configuration options.
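A minimal config might look like the following. Only the telemetry key is confirmed elsewhere in this FAQ; the other key names are assumptions, so check caro --help for the real ones. The sketch writes to a temporary directory rather than your real config:

```shell
# Stand-in directory so the sketch doesn't touch your real config.
# The real location is ~/.config/caro/config.toml.
conf_dir=$(mktemp -d)
cat > "$conf_dir/config.toml" <<'EOF'
# Hypothetical key names for illustration; see `caro --help` for the real ones.
backend = "ollama"
model = "llama3"
telemetry = false
EOF
cat "$conf_dir/config.toml"
```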
Troubleshooting
Caro says 'no backend available'. What do I do?
This means Caro couldn't find a local LLM or API key. Either: (1) Install and start Ollama with a model, (2) Set up MLX if on Apple Silicon, or (3) Set an API key for a cloud provider with export ANTHROPIC_API_KEY=....
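The checks above can be sketched as a quick diagnostic you could run yourself (illustrative; Caro's own detection logic may differ, and the exact env var names for cloud keys are assumptions apart from ANTHROPIC_API_KEY):

```shell
# Quick diagnostic: which backend could Caro plausibly find here?
backend_hint() {
  if command -v ollama >/dev/null 2>&1; then
    echo "ollama found: make sure it is running and has a model pulled"
  elif [ -n "${ANTHROPIC_API_KEY:-}" ] || [ -n "${OPENAI_API_KEY:-}" ]; then
    echo "cloud API key found: Caro can use a cloud backend"
  else
    echo "no backend: install Ollama or export an API key"
  fi
}
backend_hint
```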
Commands are generating slowly. How do I speed things up?
Local inference speed depends on your hardware and model size. Try a smaller model (7B instead of 70B), use MLX on Apple Silicon for acceleration, or switch to a cloud backend for faster responses.
The generated command is wrong. What should I do?
You can: (1) Rephrase your request to be more specific, (2) Add context like "on macOS" or "using find command", (3) Try a different backend/model, or (4) Report the issue on GitHub so we can improve.
How do I report a bug or request a feature?
Open an issue on our GitHub repository at github.com/wildcard/caro/issues. Include your OS, Caro version (caro --version), backend, and steps to reproduce.
Privacy & Telemetry
What data does Caro collect?
By default, Caro collects minimal, anonymous usage metrics to help us improve the product, such as command success rates and which backends are used. No commands, file paths, or personal data are ever collected. See our telemetry page for details.
Can I disable telemetry?
Yes. Set telemetry = false in your config file or use export CARO_TELEMETRY=false. Caro will work exactly the same with telemetry disabled.
Is Caro GDPR compliant?
Yes. We don't collect personal data. All telemetry is anonymous and aggregated. You have full control over what data is sent (if any), and can disable telemetry entirely.
Contributing & Community
How can I contribute to Caro?
We welcome contributions! You can: submit bug reports, propose features, contribute code (it's Rust!), add safety patterns, improve documentation, or help with translations. See our contributing guide.
Is Caro open source?
Yes! Caro is licensed under AGPL-3.0. You can read, modify, and distribute the source code. The full codebase is available on GitHub.
How can I support Caro's development?
You can support us through GitHub Sponsors or Open Collective. Every contribution helps us maintain and improve Caro. See our support page for more details.
Still have questions?
Can't find what you're looking for? We're here to help.