A Dora node that provides access to LLaMA-based models using either a llama.cpp or a Hugging Face Transformers backend for text generation.

Install it with uv:

uv venv -p 3.11 --seed
uv pip install -e .
The node can be configured in your dataflow YAML file:
- id: dora-llama-cpp-python
  build: pip install -e path/to/dora-llama-cpp-python
  path: dora-llama-cpp-python
  inputs:
    text: source_node/text  # Input text to generate a response for
  outputs:
    - text  # Generated response text
  env:
    MODEL_BACKEND: "llama-cpp"  # or "huggingface"
    MODEL_REPO_ID: "Qwen/Qwen2.5-0.5B-Instruct-GGUF"  # For llama-cpp backend
    MODEL_FILENAME: "*fp16.gguf"  # For llama-cpp backend
    HF_MODEL_NAME: "Qwen/Qwen2.5-0.5B-Instruct"  # For huggingface backend
    SYSTEM_PROMPT: "You're a very succinct AI assistant with short answers."
    ACTIVATION_WORDS: "what how who where you"
    MAX_TOKENS: "512"
- MODEL_BACKEND: Choose between:
  - llama-cpp: Uses GGUF models via llama.cpp (CPU-optimized, default)
  - huggingface: Uses Hugging Face Transformers models
- SYSTEM_PROMPT: Customize the AI assistant's personality/behavior
- ACTIVATION_WORDS: Space-separated list of words that trigger a model response
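To make the configuration concrete, here is a rough sketch of how a node like this might read these variables and serve the llama-cpp backend, written against the Dora Python API and llama-cpp-python. It is illustrative only, not the code shipped in this repository; the huggingface backend would swap the Llama calls for Hugging Face Transformers.

```python
import os

import pyarrow as pa
from dora import Node
from llama_cpp import Llama

# Configuration from the dataflow env: section (defaults below are illustrative).
ACTIVATION_WORDS = os.getenv("ACTIVATION_WORDS", "what how who where you").split()
SYSTEM_PROMPT = os.getenv(
    "SYSTEM_PROMPT", "You're a very succinct AI assistant with short answers."
)
MAX_TOKENS = int(os.getenv("MAX_TOKENS", "512"))

# llama-cpp backend: fetch a GGUF file from the Hugging Face Hub and load it.
llm = Llama.from_pretrained(
    repo_id=os.getenv("MODEL_REPO_ID", "Qwen/Qwen2.5-0.5B-Instruct-GGUF"),
    filename=os.getenv("MODEL_FILENAME", "*fp16.gguf"),
    verbose=False,
)

node = Node()
for event in node:
    if event["type"] == "INPUT" and event["id"] == "text":
        prompt = event["value"][0].as_py()
        # Only answer when the input contains at least one activation word.
        if any(word in prompt.lower().split() for word in ACTIVATION_WORDS):
            completion = llm.create_chat_completion(
                messages=[
                    {"role": "system", "content": SYSTEM_PROMPT},
                    {"role": "user", "content": prompt},
                ],
                max_tokens=MAX_TOKENS,
            )
            reply = completion["choices"][0]["message"]["content"]
            node.send_output("text", pa.array([reply]), event["metadata"])
```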
This example shows how to create a conversational AI pipeline that:

- Captures audio from a microphone
- Detects speech segments with voice activity detection (dora-vad)
- Transcribes speech to text with distil-whisper
- Generates a text response with dora-llama-cpp-python
- Speaks the response with dora-kokoro-tts
nodes:
  - id: dora-microphone
    build: pip install dora-microphone
    path: dora-microphone
    inputs:
      tick: dora/timer/millis/2000
    outputs:
      - audio

  - id: dora-vad
    build: pip install dora-vad
    path: dora-vad
    inputs:
      audio: dora-microphone/audio
    outputs:
      - audio
      - timestamp_start

  - id: dora-whisper
    build: pip install dora-distil-whisper
    path: dora-distil-whisper
    inputs:
      input: dora-vad/audio
    outputs:
      - text

  - id: dora-llama-cpp-python
    build: pip install -e .
    path: dora-llama-cpp-python
    inputs:
      text: dora-whisper/text
    outputs:
      - text
    env:
      MODEL_BACKEND: llama-cpp
      MODEL_REPO_ID: "Qwen/Qwen2.5-0.5B-Instruct-GGUF"
      MODEL_FILENAME: "*fp16.gguf"
      SYSTEM_PROMPT: "You're a helpful assistant."
      ACTIVATION_WORDS: "hey help what how"

  - id: dora-tts
    build: pip install dora-kokoro-tts
    path: dora-kokoro-tts
    inputs:
      text: dora-llama-cpp-python/text
    outputs:
      - audio
Build and run the example dataflow:

dora build example.yml
dora run example.yml
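To inspect the generated text without the TTS stage, you could wire a small consumer node into the dataflow (for example with text: dora-llama-cpp-python/text as its input). The snippet below is only a sketch against the Dora Python API; such a node is not part of this repository.

```python
from dora import Node

node = Node()
for event in node:
    # Print every response arriving on the "text" input.
    if event["type"] == "INPUT" and event["id"] == "text":
        print("LLM:", event["value"][0].as_py())
```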
Lint and format the code with ruff:

uv pip install ruff
uv run ruff check . --fix
uv run ruff check .
Run the tests with pytest:

uv pip install pytest
uv run pytest .
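As an illustration of the style, a test could exercise the activation-word gating described above; contains_activation_word here is a hypothetical helper written for the example, not a function exported by this package.

```python
# Illustrative test only; contains_activation_word is a hypothetical helper.
def contains_activation_word(text: str, activation_words: list[str]) -> bool:
    """Return True if any activation word appears in the (lowercased) text."""
    return any(word in text.lower().split() for word in activation_words)


def test_activation_words_gate_responses():
    words = "what how who where you".split()
    assert contains_activation_word("What is Dora?", words)
    assert not contains_activation_word("Complete silence.", words)
```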
dora-llama-cpp-python's code is released under the MIT License