This PR makes it possible to run a dataflow from Python using:
```python
from dora import run
# Make sure to build it first with the CLI.
# Build step is on the todo list.
run("qwen2.5.yaml", uv=True)
```
This makes it easier to run dataflows programmatically as needed.
This pull request introduces a new Dora node called `dora-transformer`
that allows access to Hugging Face transformer models for text
generation and chat completion. The changes include adding a
comprehensive README, initializing the main functionality of the node,
defining dependencies, and adding a test file.
Key changes:
### Core functionality:
* Implemented the main logic in `dora_transformer/main.py` for loading
models, generating responses, and handling memory management.
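For orientation, a minimal sketch of how such a main loop could be structured, assuming the standard `dora` Python API (`Node`, `send_output`) and the Hugging Face `transformers` generation API; the model name, input/output IDs, and system prompt below are illustrative, not taken from the actual node:

```python
# Hypothetical sketch of a dora-transformer main loop; the real
# dora_transformer/main.py may differ in IDs, config, and error handling.
import pyarrow as pa
import torch
from dora import Node
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "Qwen/Qwen2.5-0.5B-Instruct"  # illustrative default


def main():
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_NAME, torch_dtype="auto", device_map="auto"
    )
    # Conversation history kept in memory for chat completion.
    history = [{"role": "system", "content": "You are a helpful assistant."}]

    node = Node()
    for event in node:
        if event["type"] == "INPUT" and event["id"] == "text":
            prompt = event["value"][0].as_py()
            history.append({"role": "user", "content": prompt})
            inputs = tokenizer.apply_chat_template(
                history, add_generation_prompt=True, return_tensors="pt"
            ).to(model.device)
            with torch.no_grad():
                output_ids = model.generate(inputs, max_new_tokens=256)
            reply = tokenizer.decode(
                output_ids[0][inputs.shape[-1]:], skip_special_tokens=True
            )
            history.append({"role": "assistant", "content": reply})
            node.send_output("text", pa.array([reply]), event["metadata"])


if __name__ == "__main__":
    main()
```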
## Summary by Sourcery
Introduces a new `dora-transformer` node that provides access to Hugging
Face transformer models for text generation and chat completion. The
node supports multi-platform GPU acceleration, memory-efficient model
loading, configurable system prompts, and conversation history
management.
New Features:
- Introduces a new `dora-transformer` node for text generation and chat
completion using Hugging Face transformer models.
- Provides multi-platform acceleration support (CUDA, MPS) with a CPU fallback.
- Offers memory-efficient model loading with 8-bit quantization (see the sketch after this list).
- Supports configurable system prompts and activation words.
- Implements conversation history management.
- Integrates seamlessly with speech-to-text and text-to-speech
pipelines.
- Optimized for both small and large language models.
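The acceleration and quantization bullets above could be implemented roughly as follows. This is a sketch assuming the `transformers` `BitsAndBytesConfig` API and the usual `torch` device checks, not the node's actual code; bitsandbytes 8-bit loading is generally CUDA-only, hence the fallback path, and the `load_model` helper name is illustrative.

```python
# Illustrative device selection and memory-efficient loading.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig


def pick_device() -> str:
    if torch.cuda.is_available():
        return "cuda"
    if torch.backends.mps.is_available():
        return "mps"
    return "cpu"


def load_model(model_name: str):
    device = pick_device()
    if device == "cuda":
        # bitsandbytes 8-bit quantization is only available on CUDA.
        quant = BitsAndBytesConfig(load_in_8bit=True)
        return AutoModelForCausalLM.from_pretrained(
            model_name, quantization_config=quant, device_map="auto"
        )
    # On MPS/CPU, fall back to regular dtypes without quantization.
    dtype = torch.float16 if device == "mps" else torch.float32
    return AutoModelForCausalLM.from_pretrained(
        model_name, torch_dtype=dtype
    ).to(device)
```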
Tests:
- Adds a test file to verify the basic functionality of the
`dora-transformer` node.
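For context, node tests of this kind often take the shape below: outside a running dataflow there is no daemon to connect to, so the test simply asserts that `main()` raises. This is a hedged guess at the test's shape, not the actual file.

```python
# Hypothetical smoke test; the real test file may differ.
import pytest


def test_import_main():
    from dora_transformer.main import main

    # Not running inside a dora dataflow, so main() is expected to raise
    # rather than hang waiting for inputs.
    with pytest.raises(RuntimeError):
        main()
```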
### Changes
- Migrated the entire `robots` directory from `dora-lerobot` to `examples/dora-lerobot/robots` in the dora repository.
- Updated references in the affected files.
### Purpose
This migration consolidates robot examples into the main dora
repository, making it easier for users to discover and use these
components without needing to clone a separate repository.
This pull request introduces a new Dora node, `dora-llama-cpp-python`,
which provides access to LLaMA-based models using either llama.cpp or
Hugging Face backends for text generation.
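One plausible way such a node might switch between backends is sketched below, assuming the `llama-cpp-python` API (`Llama`, `Llama.from_pretrained`, `create_chat_completion`); the environment variable names and default filename pattern are illustrative, not the node's documented configuration.

```python
# Hypothetical backend selection for a llama.cpp-based node.
import os

from llama_cpp import Llama


def load_llm() -> Llama:
    model_path = os.getenv("MODEL_PATH")  # local GGUF file (llama.cpp backend)
    model_repo = os.getenv("MODEL_REPO")  # Hugging Face Hub repo id
    if model_path:
        # llama.cpp backend: load a GGUF file directly from disk.
        return Llama(model_path=model_path, n_ctx=4096)
    # Hugging Face backend: download a GGUF from the Hub, then run it
    # through llama.cpp (requires huggingface-hub to be installed).
    return Llama.from_pretrained(
        repo_id=model_repo,
        filename=os.getenv("MODEL_FILE", "*q4_k_m.gguf"),
        n_ctx=4096,
    )


def generate(llm: Llama, prompt: str) -> str:
    result = llm.create_chat_completion(
        messages=[{"role": "user", "content": prompt}],
        max_tokens=256,
    )
    return result["choices"][0]["message"]["content"]
```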