Documentation Index
Fetch the complete documentation index at: https://node-guide.dria.co/llms.txt
Use this file to discover all available pages before exploring further.
Node Setup
Do I need Ollama?
No. The new dria-node binary runs models natively using llama.cpp; Ollama is no longer required. Simply run dria-node setup to download and configure your model.
What are the minimum system requirements?
- RAM: At least ~1 GB for the smallest model (qwen3.5:0.8b), up to ~27 GB for the largest (nemotron:30b-a3b)
- Disk: 0.5 GB to 24.5 GB for model files, depending on model choice
- Network: Outbound UDP port 4001 (QUIC)
- OS: macOS (Intel + Apple Silicon), Linux (x86_64, arm64), Windows (x86_64)
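As a rough illustration of how the RAM figures above constrain model choice, the sketch below is a hypothetical helper (not part of dria-node); only the two model names and their approximate RAM needs are taken from this FAQ, everything else is illustrative:

```python
# Hypothetical helper: pick models that fit in available RAM.
# Model names and approximate RAM needs come from the requirements above;
# the selection logic is illustrative, not dria-node's actual behavior.
MODEL_RAM_GB = {
    "qwen3.5:0.8b": 1.0,       # smallest model, ~1 GB RAM
    "nemotron:30b-a3b": 27.0,  # largest model, ~27 GB RAM
}

def models_that_fit(available_ram_gb: float) -> list[str]:
    """Return model names whose approximate RAM need fits the given budget."""
    return [name for name, need in MODEL_RAM_GB.items() if need <= available_ram_gb]

print(models_that_fit(8.0))   # only the small model fits in 8 GB
print(models_that_fit(32.0))  # both models fit in 32 GB
```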
How do I install dria-node?
The fastest way:
- macOS/Linux: brew install firstbatchxyz/dkn/dria-node or curl -fsSL https://raw.githubusercontent.com/firstbatchxyz/dkn-compute-node/master/install.sh | sh
- Windows: irm https://raw.githubusercontent.com/firstbatchxyz/dkn-compute-node/master/install.ps1 | iex
Running a Node
How do I check if my node is online?
Visit the Dria Edge AI Dashboard and log in with the wallet associated with your node. The dashboard shows your node’s status, recent activity, and earned points.
Can I run multiple nodes?
Yes. You can run multiple nodes on the same machine or network, but each node must use a unique private key (wallet). Using the same key for multiple nodes will cause conflicts with task assignment and rewards.
What ports do I need open?
Your node needs outbound access on UDP port 4001 to connect to the Dria router via QUIC. No inbound ports need to be opened.
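Since QUIC runs over UDP, a quick local sanity check is whether the operating system permits an outbound UDP send at all. The Python sketch below is a best-effort probe, not an official dria-node diagnostic; because UDP has no handshake, a successful send only proves that no local firewall or policy rejected the packet, not that the router was reached:

```python
import socket

def can_send_udp(host: str, port: int = 4001) -> bool:
    """Best-effort check that an outbound UDP datagram to host:port is
    not blocked locally. True does NOT guarantee the packet arrived;
    UDP is connectionless and has no delivery acknowledgment."""
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.settimeout(2.0)
            s.sendto(b"\x00", (host, port))
        return True
    except OSError:
        return False

print(can_send_udp("127.0.0.1"))  # sending to localhost normally succeeds
```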
Does dria-node support GPU acceleration?
Yes. dria-node supports:
- Apple Metal (macOS) — works automatically on Apple Silicon
- NVIDIA CUDA — use the CUDA build
- AMD ROCm 6.x — use the ROCm install script (Linux x86_64 only)
Use --gpu-layers -1 to offload all model layers to the GPU.
How does auto-update work?
dria-node checks for new versions on GitHub Releases at startup. Patch bumps (e.g. 0.7.3 → 0.7.4) show a warning. Minor or major version bumps trigger an automatic update to keep your node compatible with the network. You can skip this with --skip-update.
Earning Points
Why am I not earning points?
Common reasons:
- No tasks — There may be low demand for the model(s) you’re running. Check the dashboard and consider switching to higher-demand models.
- Node offline — Make sure dria-node is running and your network allows outbound UDP on port 4001.
- Low reputation — New nodes start with a neutral reputation score. Complete tasks consistently to build it up. Failing or timing out on tasks reduces your score.
How often are $DRIA Points updated?
Points on the Dria Edge AI Dashboard are updated in near real-time as your node completes tasks.
Using the Network (CLI)
How do I use Dria for inference?
How does payment work for using the network?
The CLI uses USDC credits on the Base network, which you top up through the CLI. Top-ups use the x402 payment protocol with gasless EIP-712 signed transfers. See the CLI Guide for details.
Troubleshooting
My node keeps disconnecting
The node automatically reconnects with exponential backoff (1s, 2s, 4s, 8s, 16s). If it repeatedly fails:
- Check your internet connection
- Ensure outbound UDP port 4001 is not blocked by a firewall
- Try setting RUST_LOG=debug for more detailed logs
- Make sure you’re running the latest version (dria-node auto-updates by default)
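The reconnect schedule described above (1s, 2s, 4s, 8s, 16s) is plain exponential backoff. A minimal sketch of that schedule, not dria-node's actual implementation:

```python
def backoff_delays(attempts: int, base_seconds: float = 1.0) -> list[float]:
    """Delay before each reconnect attempt: doubles every time (1s, 2s, 4s, ...)."""
    return [base_seconds * (2 ** i) for i in range(attempts)]

print(backoff_delays(5))  # → [1.0, 2.0, 4.0, 8.0, 16.0]
```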
Model download fails
Models are downloaded from HuggingFace. If the download fails:
- Check your internet connection
- Ensure you have enough disk space in ~/.dria/models/
- Try running dria-node setup again — it will resume where it left off
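Resumable downloads like this typically work by asking the server for only the bytes not yet on disk, via an HTTP Range request. A hedged sketch of that idea (hypothetical helper; dria-node's actual resume mechanism may differ):

```python
import os

def resume_range_header(partial_path: str) -> dict[str, str]:
    """Build the HTTP Range header needed to resume a download,
    based on how many bytes of the file already exist on disk.
    Illustrative only; not dria-node's actual download code."""
    try:
        offset = os.path.getsize(partial_path)
    except OSError:
        offset = 0  # no partial file yet: download from the start
    # "bytes=N-" asks the server for everything from byte N onward
    return {"Range": f"bytes={offset}-"} if offset else {}
```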
Should I avoid WSL on Windows?
We recommend using native Windows (PowerShell) instead of WSL for the best experience. If you encounter issues on WSL, switch to the native Windows installation method.
