@@ -256,6 +256,67 @@ nodes:
The full documentation is available on [our website](https://dora-rs.ai/).
Many guides are available in [this section](https://dora-rs.ai/docs/guides/) of our website.
### Documentation Structure

```
dora/docs/src/
├── api/
│   ├── python/
│   │   └── README.md
│   ├── rust/
│   │   └── README.md
│   └── README.md
├── architecture/
│   └── README.md
├── assets/
│   ├── logo.svg
│   ├── bar_chart_dark.svg
│   ├── bar_chart_light.svg
│   └── ...
├── community/
│   └── README.md
├── examples/
│   ├── basic/
│   │   ├── README.md
│   │   └── hello_world.md
│   └── advanced/
│       ├── README.md
│       └── image_processing.md
├── quickstart/
│   └── README.md
└── troubleshooting/
    └── README.md
```

### Quick Start

- [Getting Started Guide](docs/src/quickstart/README.md) - Step-by-step tutorial for absolute beginners
- [System Requirements](docs/src/quickstart/README.md#system-requirements) - Hardware and software prerequisites
- [Installation Guide](docs/src/quickstart/README.md#quick-installation) - Installation instructions for different platforms

### API Documentation

- [Python API Guide](docs/src/api/python/README.md) - Python API documentation with examples
- [Rust API Guide](docs/src/api/rust/README.md) - Rust API documentation with examples
- [Performance Considerations](docs/src/api/README.md#performance-considerations) - Best practices for performance

### Architecture and Concepts

- [Architecture Overview](docs/src/architecture/README.md) - System architecture and design principles
- [Data Flow](docs/src/architecture/README.md#data-flow) - Understanding data flow between nodes
- [Core Components](docs/src/architecture/README.md#core-components) - Key system components

### Examples and Tutorials

- [Basic Examples](docs/src/examples/basic/README.md) - Simple examples to get started
- [Advanced Examples](docs/src/examples/advanced/README.md) - Complex use cases and patterns
- [Performance Examples](docs/src/examples/advanced/README.md#performance-considerations) - Optimization examples

### Troubleshooting

- [Common Issues](docs/src/troubleshooting/README.md#common-issues) - Solutions to common problems
- [Debugging Tools](docs/src/troubleshooting/README.md#debugging-tools) - Tools and techniques for debugging
- [Performance Tuning](docs/src/troubleshooting/README.md#performance-tuning) - Guidelines for performance optimization

### Community

- [Community Guidelines](docs/src/community/README.md) - How to get involved
- [Support Channels](docs/src/community/README.md#support-channels) - Where to get help
- [Contributing Guide](docs/src/community/README.md#contributing) - How to contribute
## What is Dora? And what features does Dora offer?

**D**ataflow-**O**riented **R**obotic **A**rchitecture (`dora-rs`) is a framework that makes creating robotic applications fast and simple.
@@ -0,0 +1,305 @@
# Dora-RS API Documentation

## Overview

This document provides comprehensive documentation for the Dora-RS API, covering both the Python and Rust interfaces.

## Python API

### Core Components

#### Node Class

```python
from dora import Node

class MyNode(Node):
    def __init__(self):
        super().__init__()

    def on_input(self, input_data):
        processed_data = input_data  # replace with real processing
        return processed_data
```

#### Data Types

```python
from dora import DataType

# Supported data types
class SupportedTypes:
    STRING = DataType.STRING
    INT = DataType.INT
    FLOAT = DataType.FLOAT
    BYTES = DataType.BYTES
    JSON = DataType.JSON
```

#### Configuration

```python
from dora import Config

config = Config(
    name="my_node",
    inputs=["input1", "input2"],
    outputs=["output1"],
    parameters={
        "param1": "value1",
        "param2": 42
    }
)
```
### Advanced Features

#### Async Support

```python
from dora import AsyncNode

class AsyncMyNode(AsyncNode):
    async def on_input(self, input_data):
        processed_data = input_data  # replace with real async processing
        return processed_data
```

#### Error Handling

```python
from dora import Node, NodeError

class SafeNode(Node):
    def on_input(self, input_data):
        try:
            result = input_data  # replace with real processing
            return result
        except Exception as e:
            raise NodeError(f"Processing failed: {e}") from e
```
## Rust API

### Core Components

#### Node Trait

```rust
use dora_node_api::Node;

struct MyNode {
    // Node state
}

impl Node for MyNode {
    fn on_input(&mut self, input: Input) -> Result<Output, Error> {
        // Process `input` here and return the resulting output
        Ok(output)
    }
}
```

#### Data Types

```rust
use dora_node_api::DataType;

// Supported data types
enum SupportedTypes {
    String,
    Int,
    Float,
    Bytes,
    Json,
}
```

#### Configuration

```rust
use dora_node_api::Config;
use maplit::hashmap; // `hashmap!` macro from the `maplit` crate

let config = Config::new("my_node")
    .with_inputs(vec!["input1", "input2"])
    .with_outputs(vec!["output1"])
    .with_parameters(hashmap! {
        "param1".to_string() => "value1".to_string(),
        "param2".to_string() => "42".to_string(),
    });
```
### Advanced Features

#### Async Support

```rust
use async_trait::async_trait;
use dora_node_api::AsyncNode;

struct AsyncMyNode {
    // Node state
}

#[async_trait]
impl AsyncNode for AsyncMyNode {
    async fn on_input(&mut self, input: Input) -> Result<Output, Error> {
        // Async processing
        Ok(output)
    }
}
```

#### Error Handling

```rust
use dora_node_api::{Node, NodeError};

struct SafeNode {
    // Node state
}

impl Node for SafeNode {
    fn on_input(&mut self, input: Input) -> Result<Output, Error> {
        // Map failures into a `NodeError` instead of panicking;
        // `process` stands in for your fallible processing logic
        let output = process(input).map_err(|e| NodeError::ProcessingError(e.to_string()))?;
        Ok(output)
    }
}
```
## Common Patterns

### 1. Data Transformation

```python
# Python
def transform_data(input_data):
    # Transform data
    return transformed_data
```

```rust
// Rust
fn transform_data(input: Input) -> Result<Output, Error> {
    // Transform data
    Ok(transformed_data)
}
```

### 2. State Management

```python
# Python
class StatefulNode(Node):
    def __init__(self):
        super().__init__()
        self.state = {}

    def on_input(self, input_data):
        # Update state
        self.state.update(input_data)
        return processed_data
```

```rust
// Rust
struct StatefulNode {
    state: HashMap<String, Value>,
}

impl Node for StatefulNode {
    fn on_input(&mut self, input: Input) -> Result<Output, Error> {
        // Update state
        self.state.extend(input.iter());
        Ok(processed_data)
    }
}
```

### 3. Resource Management

```python
# Python
class ResourceNode(Node):
    def __init__(self):
        super().__init__()
        self.resource = None

    def setup(self):
        self.resource = acquire_resource()

    def cleanup(self):
        if self.resource:
            release_resource(self.resource)
```

```rust
// Rust
struct ResourceNode {
    resource: Option<Resource>,
}

impl Node for ResourceNode {
    fn setup(&mut self) -> Result<(), Error> {
        self.resource = Some(acquire_resource()?);
        Ok(())
    }

    fn cleanup(&mut self) {
        if let Some(resource) = self.resource.take() {
            release_resource(resource);
        }
    }
}
```
## Performance Optimization

### 1. Memory Management

```python
# Python
class OptimizedNode(Node):
    def __init__(self):
        super().__init__()
        self.buffer = bytearray(1024)  # Pre-allocate buffer

    def on_input(self, input_data):
        # Reuse buffer
        return processed_data
```

```rust
// Rust
struct OptimizedNode {
    buffer: Vec<u8>,
}

impl Node for OptimizedNode {
    fn on_input(&mut self, input: Input) -> Result<Output, Error> {
        // Reuse buffer
        Ok(processed_data)
    }
}
```

### 2. Batch Processing

```python
# Python
class BatchNode(Node):
    def on_input(self, input_data):
        # Process batch
        return processed_batch
```

```rust
// Rust
struct BatchNode;

impl Node for BatchNode {
    fn on_input(&mut self, input: Input) -> Result<Output, Error> {
        // Process batch
        Ok(processed_batch)
    }
}
```
## Best Practices

### 1. Error Handling

- Use specific error types
- Provide meaningful error messages
- Implement proper error recovery
- Log errors appropriately

### 2. Resource Management

- Clean up resources properly
- Use RAII patterns
- Implement proper shutdown
- Handle resource exhaustion

### 3. Performance

- Minimize allocations
- Use efficient data structures
- Implement proper buffering
- Optimize hot paths

### 4. Testing

- Write unit tests
- Implement integration tests
- Test error cases
- Benchmark performance
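As a concrete starting point for the testing bullets above, node logic can be pulled into a framework-free coroutine and asserted on directly; a minimal sketch (the `transform` coroutine and its doubling behavior are illustrative, not part of the Dora API):

```python
import asyncio

async def transform(data):
    # Hypothetical node logic, kept free of framework dependencies
    # so it can be unit-tested without running a dataflow
    return {"value": float(data.get("value", 0)) * 2}

# Unit tests: call the coroutine directly and assert on its output
assert asyncio.run(transform({"value": "3.5"})) == {"value": 7.0}
assert asyncio.run(transform({})) == {"value": 0.0}
```

Keeping the transformation separate from the node class makes both unit tests and later integration tests straightforward.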
@@ -0,0 +1,165 @@
# Python API Documentation

## Overview

The Dora Python API provides a high-level interface for creating and managing Dora nodes and dataflows.

[See the full Python API documentation here](python/README.md)

## Core Components

### Node Class

```python
from typing import Any, Dict, Optional

from dora import Node

class MyNode(Node):
    def __init__(self, config: Dict[str, Any]):
        super().__init__(config)
        self.counter = 0

    async def on_input(self, input_data: Dict[str, Any]) -> Optional[Dict[str, Any]]:
        """
        Process incoming data from other nodes.

        Args:
            input_data: Dictionary containing input data from connected nodes

        Returns:
            Optional[Dict[str, Any]]: Output data to be sent to connected nodes
        """
        self.counter += 1
        return {"count": self.counter}
```

### DataFlow Class

```python
from dora import DataFlow

# Create a new dataflow
flow = DataFlow()

# Add nodes
node1 = flow.add_node("counter", MyNode, {"initial_value": 0})
node2 = flow.add_node("processor", ProcessorNode, {"threshold": 100})

# Connect nodes
flow.connect("counter.output", "processor.input")

# Run the dataflow
flow.run()
```
## Type Hints and Return Values

### Common Types

```python
from typing import Any, Dict, List, TypedDict, Union

class NodeConfig(TypedDict):
    name: str
    type: str
    parameters: Dict[str, Any]

class NodeOutput(TypedDict):
    data: Union[Dict[str, Any], List[Any]]
    metadata: Dict[str, Any]
```

### Node Methods

```python
from typing import Any, Dict, Optional

class Node:
    async def on_input(self, data: Dict[str, Any]) -> Optional[Dict[str, Any]]:
        """Process input data and return output."""
        pass

    async def on_start(self) -> None:
        """Called when the node starts."""
        pass

    async def on_stop(self) -> None:
        """Called when the node stops."""
        pass
```
## Performance Considerations

### Memory Management

- Use streaming for large datasets
- Implement proper cleanup in `on_stop`
- Monitor memory usage with built-in metrics

### Async Operations

- Use `async/await` for I/O operations
- Implement proper error handling
- Use connection pooling for external services
## Best Practices

### Error Handling

```python
class MyNode(Node):
    async def on_input(self, data: Dict[str, Any]) -> Optional[Dict[str, Any]]:
        try:
            result = await self.process_data(data)
            return result
        except ValueError as e:
            self.logger.error(f"Invalid input: {e}")
            return None
        except Exception as e:
            self.logger.error(f"Unexpected error: {e}")
            raise
```

### Logging

```python
class MyNode(Node):
    def __init__(self, config: Dict[str, Any]):
        super().__init__(config)
        self.logger.info("Node initialized")
        self.logger.debug(f"Config: {config}")
```

### Configuration

```python
class MyNode(Node):
    def __init__(self, config: Dict[str, Any]):
        super().__init__(config)
        self.threshold = config.get("threshold", 100)
        self.timeout = config.get("timeout", 30)
```

## Examples

### Basic Node

```python
from typing import Any, Dict, Optional

from dora import Node

class EchoNode(Node):
    async def on_input(self, data: Dict[str, Any]) -> Optional[Dict[str, Any]]:
        return {"echo": data}
```

### Data Transformation

```python
class TransformNode(Node):
    async def on_input(self, data: Dict[str, Any]) -> Optional[Dict[str, Any]]:
        transformed = {
            "timestamp": data.get("time"),
            "value": float(data.get("value", 0)),
            "metadata": data.get("metadata", {})
        }
        return transformed
```

### External Service Integration

```python
import aiohttp

class APINode(Node):
    async def on_input(self, data: Dict[str, Any]) -> Optional[Dict[str, Any]]:
        async with aiohttp.ClientSession() as session:
            async with session.get(self.config["api_url"]) as response:
                result = await response.json()
                return {"api_response": result}
```
@@ -0,0 +1,227 @@
# Rust API Documentation

## Overview

The Dora Rust API provides a low-level, high-performance interface for creating and managing Dora nodes and dataflows.

[See the full Rust API documentation here](rust/README.md)

## Core Components

### Node Trait

```rust
use dora::prelude::*;
use serde::{Deserialize, Serialize};

#[derive(Debug, Serialize, Deserialize)]
struct NodeConfig {
    threshold: f64,
    timeout: u64,
}

struct MyNode {
    config: NodeConfig,
    counter: u64,
}

impl Node for MyNode {
    type Config = NodeConfig;
    type Input = Value;
    type Output = Value;

    fn new(config: Self::Config) -> Self {
        Self {
            config,
            counter: 0,
        }
    }

    async fn on_input(&mut self, input: Self::Input) -> Result<Option<Self::Output>, Error> {
        self.counter += 1;
        Ok(Some(Value::Number(self.counter as f64)))
    }
}
```

### DataFlow Builder

```rust
use dora::DataFlow;

let mut flow = DataFlow::new();

// Add nodes
let counter = flow.add_node("counter", MyNode::new(NodeConfig {
    threshold: 100.0,
    timeout: 30,
}));
let processor = flow.add_node("processor", ProcessorNode::new(ProcessorConfig {
    batch_size: 100,
    ..Default::default()
}));

// Connect nodes
flow.connect("counter.output", "processor.input")?;

// Run the dataflow
flow.run().await?;
```
## Type System

### Common Types

```rust
use std::collections::HashMap;

use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};

#[derive(Debug, Serialize, Deserialize)]
pub struct NodeMetadata {
    pub name: String,
    pub version: String,
    pub timestamp: DateTime<Utc>,
}

#[derive(Debug, Serialize, Deserialize)]
pub enum Value {
    Null,
    Boolean(bool),
    Number(f64),
    String(String),
    Array(Vec<Value>),
    Object(HashMap<String, Value>),
}
```

### Error Handling

```rust
use thiserror::Error;

#[derive(Error, Debug)]
pub enum NodeError {
    #[error("Invalid input: {0}")]
    InvalidInput(String),

    #[error("Processing error: {0}")]
    ProcessingError(String),

    #[error("IO error: {0}")]
    IoError(#[from] std::io::Error),
}

type Result<T> = std::result::Result<T, NodeError>;
```
## Performance Considerations

### Memory Management

- Use zero-copy data transfer where possible
- Implement proper cleanup in `Drop`
- Use memory pools for frequently allocated types

### Async Operations

- Use `tokio` for the async runtime
- Implement proper backpressure handling
- Use connection pooling for external services

## Best Practices

### Error Handling

```rust
impl Node for MyNode {
    async fn on_input(&mut self, input: Self::Input) -> Result<Option<Self::Output>, Error> {
        match self.process_data(input).await {
            Ok(result) => Ok(Some(result)),
            Err(e) => {
                self.logger.error(&format!("Processing error: {}", e));
                Err(NodeError::ProcessingError(e.to_string()))
            }
        }
    }
}
```

### Logging

```rust
use tracing::{info, debug, error};

impl Node for MyNode {
    fn new(config: Self::Config) -> Self {
        info!("Initializing node with config: {:?}", config);
        Self {
            config,
            counter: 0,
        }
    }
}
```

### Configuration

```rust
#[derive(Debug, Serialize, Deserialize)]
struct NodeConfig {
    #[serde(default = "default_threshold")]
    threshold: f64,
    #[serde(default = "default_timeout")]
    timeout: u64,
}

fn default_threshold() -> f64 { 100.0 }
fn default_timeout() -> u64 { 30 }
```
## Examples

### Basic Node

```rust
struct EchoNode;

impl Node for EchoNode {
    type Config = ();
    type Input = Value;
    type Output = Value;

    fn new(_config: Self::Config) -> Self {
        Self
    }

    async fn on_input(&mut self, input: Self::Input) -> Result<Option<Self::Output>, Error> {
        Ok(Some(input))
    }
}
```

### Data Transformation

```rust
struct TransformNode;

impl Node for TransformNode {
    type Config = TransformConfig;
    type Input = RawData;
    type Output = TransformedData;

    // `new` elided for brevity

    async fn on_input(&mut self, input: Self::Input) -> Result<Option<Self::Output>, Error> {
        let transformed = TransformedData {
            timestamp: input.time,
            value: input.value.parse()?,
            metadata: input.metadata,
        };
        Ok(Some(transformed))
    }
}
```

### External Service Integration

```rust
struct APINode {
    client: reqwest::Client,
    config: APIConfig,
}

impl Node for APINode {
    // associated types and `new` elided for brevity

    async fn on_input(&mut self, input: Self::Input) -> Result<Option<Self::Output>, Error> {
        let response = self.client
            .get(&self.config.api_url)
            .send()
            .await?
            .json()
            .await?;
        Ok(Some(response))
    }
}
```
@@ -0,0 +1,153 @@
# Dora-RS Architecture

## Overview

Dora-RS is a high-performance framework for running real-time multi-AI and multi-hardware applications. This document provides a detailed overview of the system architecture and its components.

## Core Components

### 1. Node System

- **Definition**: Independent processing units that handle specific tasks
- **Characteristics**:
  - Lightweight and fast execution
  - Inter-process communication
  - Resource isolation
  - Hot-reload capability

### 2. Data Flow

- **Stream Processing**: Real-time data streaming between nodes
- **Data Types**: Support for various data formats
- **Buffering**: Efficient memory management
- **Backpressure Handling**: Automatic flow control
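Concretely, a dataflow of this kind is typically declared in a YAML file listing the nodes and the streams that connect them; a hypothetical minimal sketch (node IDs, paths, and keys here are illustrative, not the exact schema):

```yaml
nodes:
  - id: camera
    path: ./camera_node.py
    outputs:
      - image
  - id: detector
    path: ./detector_node.py
    inputs:
      image: camera/image
    outputs:
      - detections
```

Each `inputs` entry names the upstream node and output stream it subscribes to, which is how the runtime derives the data flow graph.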
### 3. Runtime Environment

- **Process Management**: Efficient node lifecycle management
- **Resource Allocation**: Dynamic resource distribution
- **Error Handling**: Robust error recovery
- **Monitoring**: Built-in performance metrics

## System Design Principles

### 1. Performance First

- Zero-copy data transfer
- Minimal overhead
- Efficient memory usage
- Optimized scheduling

### 2. Reliability

- Fault tolerance
- Error recovery
- Data consistency
- System stability

### 3. Extensibility

- Plugin architecture
- Custom node support
- API extensibility
- Configuration flexibility

## Architecture Diagrams

### System Overview

[Insert system overview diagram]

### Data Flow

[Insert data flow diagram]

### Node Communication

[Insert node communication diagram]

## Implementation Details

### 1. Core Runtime

- Written in Rust for maximum performance
- Thread-safe design
- Memory safety guarantees
- Efficient resource management

### 2. Node API

- Python and Rust interfaces
- Type-safe communication
- Async/await support
- Error handling

### 3. Data Processing

- Stream processing
- Batch processing
- Real-time processing
- Data transformation

## Performance Considerations

### 1. Memory Management

- Efficient allocation
- Garbage collection
- Memory pooling
- Resource cleanup

### 2. CPU Utilization

- Load balancing
- Thread management
- Process scheduling
- Resource optimization

### 3. Network Efficiency

- Protocol optimization
- Connection pooling
- Data compression
- Latency reduction
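To illustrate the data-compression point above with Python's standard `zlib` (the payload and compression level are arbitrary):

```python
import zlib

payload = b"sensor-reading;" * 1000  # repetitive data compresses well
compressed = zlib.compress(payload, level=6)

assert len(compressed) < len(payload)          # smaller on the wire
assert zlib.decompress(compressed) == payload  # lossless round-trip
```

The trade-off is CPU time for bandwidth, so compression pays off mainly on slow links or highly redundant payloads.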
## Security Architecture

### 1. Node Isolation

- Process boundaries
- Resource limits
- Access control
- Sandboxing

### 2. Data Protection

- Encryption
- Authentication
- Authorization
- Audit logging

## Deployment Architecture

### 1. Local Deployment

- Single machine setup
- Resource allocation
- Process management
- Monitoring

### 2. Distributed Deployment

- Cluster management
- Load balancing
- Service discovery
- Fault tolerance

## Future Considerations

### 1. Scalability

- Horizontal scaling
- Vertical scaling
- Resource optimization
- Performance tuning

### 2. Integration

- Third-party services
- Cloud platforms
- Container support
- API extensions

## Best Practices

### 1. Development

- Code organization
- Testing strategies
- Documentation
- Version control

### 2. Deployment

- Configuration management
- Monitoring setup
- Backup strategies
- Recovery procedures
@@ -0,0 +1,28 @@
# Community

Welcome to the Dora community! Here's how you can get involved and get help.

## Getting Help

- [GitHub Issues](https://github.com/dora-rs/dora/issues)
- [Discord Community](https://discord.gg/dora)
- [Documentation](https://docs.dora-rs.org)

## Support Channels

- Stack Overflow: [dora-rs](https://stackoverflow.com/questions/tagged/dora-rs)
- GitHub Discussions
- Community Forums

## Contributing

- [How to Contribute](contributing.md)
- [Code of Conduct](code_of_conduct.md)
- [Development Guide](development.md)

## Resources

- [Blog Posts](blog.md)
- [Tutorials](../examples/README.md)
- [API Documentation](../api/README.md)

## Events

- [Upcoming Events](events.md)
- [Past Events](past_events.md)
- [Meetups](meetups.md)
@@ -0,0 +1,258 @@
# Examples and Tutorials

This directory contains various examples and tutorials to help you get started with Dora.

## Basic Examples

- [Basic Node Creation](basic/README.md)
- [Data Transformation](basic/data_transformation.md)
- [Simple Pipeline](basic/pipeline.md)

## Advanced Examples

- [Image Processing Pipeline](advanced/image_processing.md)
- [Real-time Data Processing](advanced/realtime_processing.md)
- [API Integration](advanced/api_integration.md)

## Performance Examples

- [Batch Processing](advanced/batch_processing.md)
- [Parallel Processing](advanced/parallel_processing.md)
- [Memory Optimization](advanced/memory_optimization.md)

## End-to-End Examples

- [Complete Data Pipeline](advanced/complete_pipeline.md)
- [Real-time Monitoring System](advanced/monitoring_system.md)

## Real-World Use Cases

### Image Processing Pipeline

```python
from dora import Node, DataFlow
import cv2
import numpy as np

class ImageLoader(Node):
    async def on_input(self, data):
        image = cv2.imread(data["path"])
        return {"image": image}

class ImageProcessor(Node):
    async def on_input(self, data):
        image = data["image"]
        # Apply image processing
        processed = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        return {"processed": processed}

class ImageSaver(Node):
    async def on_input(self, data):
        cv2.imwrite(data["output_path"], data["processed"])
        return {"status": "success"}

# Create pipeline
flow = DataFlow()
flow.add_node("loader", ImageLoader)
flow.add_node("processor", ImageProcessor)
flow.add_node("saver", ImageSaver)

# Connect nodes
flow.connect("loader.output", "processor.input")
flow.connect("processor.output", "saver.input")

# Run pipeline
flow.run()
```
### Real-time Data Processing

```python
class SensorReader(Node):
    async def on_input(self, data):
        # Read from sensor
        value = read_sensor()
        return {"value": value}

class DataAnalyzer(Node):
    async def on_input(self, data):
        value = data["value"]
        # Analyze data
        analysis = analyze_data(value)
        return {"analysis": analysis}

class AlertSystem(Node):
    async def on_input(self, data):
        if data["analysis"]["anomaly"]:
            send_alert(data["analysis"])
        return {"status": "processed"}
```

### API Integration

```python
import aiohttp

class APIClient(Node):
    async def on_input(self, data):
        async with aiohttp.ClientSession() as session:
            async with session.get(data["url"]) as response:
                result = await response.json()
                return {"api_response": result}

class ResponseProcessor(Node):
    async def on_input(self, data):
        response = data["api_response"]
        # Process API response
        processed = process_response(response)
        return {"processed": processed}
```

## Performance Optimization Examples

### Batch Processing

```python
class BatchProcessor(Node):
    def __init__(self, config):
        super().__init__(config)
        self.batch_size = config.get("batch_size", 100)
        self.buffer = []

    async def on_input(self, data):
        self.buffer.append(data)
        if len(self.buffer) >= self.batch_size:
            result = process_batch(self.buffer)
            self.buffer = []
            return {"batch_result": result}
        return None
```

### Parallel Processing

```python
import asyncio

class ParallelProcessor(Node):
    async def on_input(self, data):
        tasks = []
        for item in data["items"]:
            tasks.append(process_item(item))
        results = await asyncio.gather(*tasks)
        return {"results": results}
```

### Memory Optimization

```python
class MemoryEfficientNode(Node):
    def __init__(self, config):
        super().__init__(config)
        self.pool = MemoryPool()

    async def on_input(self, data):
        with self.pool.allocate() as buffer:
            result = process_with_buffer(data, buffer)
        return {"result": result}
```
## End-to-End Examples

### Complete Data Pipeline

```python
from dora import Node, DataFlow
import pandas as pd
from sklearn.preprocessing import StandardScaler

class DataLoader(Node):
    async def on_input(self, data):
        df = pd.read_csv(data["file_path"])
        return {"data": df}

class DataPreprocessor(Node):
    async def on_input(self, data):
        df = data["data"]
        scaler = StandardScaler()
        scaled_data = scaler.fit_transform(df)
        return {"scaled_data": scaled_data}

class ModelPredictor(Node):
    async def on_input(self, data):
        # `model` is assumed to be loaded elsewhere (e.g. in __init__)
        predictions = model.predict(data["scaled_data"])
        return {"predictions": predictions}

class ResultAggregator(Node):
    async def on_input(self, data):
        results = aggregate_results(data["predictions"])
        return {"final_results": results}

# Create and run pipeline
flow = DataFlow()
flow.add_node("loader", DataLoader)
flow.add_node("preprocessor", DataPreprocessor)
flow.add_node("predictor", ModelPredictor)
flow.add_node("aggregator", ResultAggregator)

# Connect nodes
flow.connect("loader.output", "preprocessor.input")
flow.connect("preprocessor.output", "predictor.input")
flow.connect("predictor.output", "aggregator.input")

# Run pipeline
flow.run()
```
| ### Real-time Monitoring System | |||
| ```python | |||
| class MetricsCollector(Node): | |||
|     async def on_input(self, data): | |||
|         # `collect_system_metrics` and the helpers below are placeholders | |||
|         metrics = collect_system_metrics() | |||
|         return {"metrics": metrics} | |||
| class AnomalyDetector(Node): | |||
|     async def on_input(self, data): | |||
|         anomalies = detect_anomalies(data["metrics"]) | |||
|         return {"anomalies": anomalies} | |||
| class AlertManager(Node): | |||
|     async def on_input(self, data): | |||
|         if data["anomalies"]: | |||
|             send_alerts(data["anomalies"]) | |||
|         return {"status": "processed"} | |||
| class DashboardUpdater(Node): | |||
|     async def on_input(self, data): | |||
|         update_dashboard(data["metrics"]) | |||
|         return {"status": "updated"} | |||
| ``` | |||
| ## Best Practices | |||
| ### Error Handling | |||
| ```python | |||
| class RobustNode(Node): | |||
|     async def on_input(self, data): | |||
|         try: | |||
|             # `process_data` stands in for your actual processing logic | |||
|             result = process_data(data) | |||
|             return {"result": result} | |||
|         except ValueError as e: | |||
|             self.logger.error(f"Invalid input: {e}") | |||
|             return None | |||
|         except Exception as e: | |||
|             self.logger.error(f"Unexpected error: {e}") | |||
|             raise | |||
| ``` | |||
| ### Configuration Management | |||
| ```python | |||
| class ConfigurableNode(Node): | |||
|     def __init__(self, config): | |||
|         super().__init__(config) | |||
|         self.threshold = config.get("threshold", 100) | |||
|         self.timeout = config.get("timeout", 30) | |||
|         self.retry_count = config.get("retry_count", 3) | |||
| ``` | |||
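The `config.get(key, default)` pattern above can be demonstrated standalone with a plain dictionary, independent of the `Node` base class (the class name here is illustrative):

```python
class ConfigurableProcessor:
    """Plain-Python stand-in showing config values with fallbacks."""
    def __init__(self, config):
        # Each setting falls back to its default when the key is absent.
        self.threshold = config.get("threshold", 100)
        self.timeout = config.get("timeout", 30)
        self.retry_count = config.get("retry_count", 3)

node = ConfigurableProcessor({"threshold": 250})
print(node.threshold, node.timeout, node.retry_count)  # 250 30 3
```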
| ### Logging and Monitoring | |||
| ```python | |||
| import time | |||
| class MonitoredNode(Node): | |||
|     async def on_input(self, data): | |||
|         self.logger.info("Processing input") | |||
|         start_time = time.time() | |||
|         # `process_data` stands in for your actual processing logic | |||
|         result = process_data(data) | |||
|         duration = time.time() - start_time | |||
|         self.logger.info(f"Processing completed in {duration:.2f}s") | |||
|         return {"result": result} | |||
| ``` | |||
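The timing logic in `MonitoredNode` can be factored into a small standalone helper; note that `time.perf_counter` is generally preferable to `time.time` for measuring durations:

```python
import time

def timed(fn, *args):
    """Run fn(*args) and return (result, wall-clock duration in seconds)."""
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start

result, duration = timed(sum, range(1000))
print(f"sum={result}, took {duration:.6f}s")
```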
| @@ -0,0 +1,42 @@ | |||
| # Advanced Examples | |||
| This directory contains complex examples that demonstrate advanced features and real-world use cases of Dora. | |||
| ## Examples | |||
| ### Image Processing Pipeline | |||
| [image_processing.md](image_processing.md) | |||
| - Real-time camera feed processing | |||
| - Multi-node pipeline architecture | |||
| - Async processing with OpenCV | |||
| - Error handling and resource management | |||
| - Performance optimization techniques | |||
| ## Advanced Concepts | |||
| - Async/await processing | |||
| - Binary data handling | |||
| - External library integration | |||
| - Complex data flow | |||
| - Error recovery | |||
| - Performance monitoring | |||
| ## Getting Started | |||
| 1. Review the prerequisites for each example | |||
| 2. Install required dependencies | |||
| 3. Follow the detailed setup instructions | |||
| 4. Run and experiment with the examples | |||
| ## Learning Objectives | |||
| - Advanced node patterns | |||
| - Complex data flow design | |||
| - Performance optimization | |||
| - Error handling strategies | |||
| - Resource management | |||
| - External system integration | |||
| ## Next Steps | |||
| - Implement your own advanced features | |||
| - Optimize performance | |||
| - Add monitoring and metrics | |||
| - Integrate with other systems | |||
| - Check out the [API Documentation](../../api/README.md) for more details | |||
| @@ -0,0 +1,225 @@ | |||
| # Image Processing Pipeline | |||
| This advanced example demonstrates a real-world image processing pipeline using Dora. | |||
| ## Overview | |||
| This example shows how to: | |||
| 1. Create a complex multi-node pipeline | |||
| 2. Handle binary data (images) | |||
| 3. Use external libraries (OpenCV) | |||
| 4. Implement error handling | |||
| 5. Use async processing | |||
| ## Project Structure | |||
| ``` | |||
| image_processing/ | |||
| ├── dora.yml | |||
| ├── requirements.txt | |||
| └── nodes/ | |||
|     ├── camera.py | |||
|     ├── preprocessor.py | |||
|     ├── detector.py | |||
|     └── visualizer.py | |||
| ``` | |||
| ## Implementation | |||
| ### 1. Dependencies (requirements.txt) | |||
| ``` | |||
| opencv-python>=4.5.0 | |||
| numpy>=1.19.0 | |||
| ``` | |||
| ### 2. Camera Node (nodes/camera.py) | |||
| ```python | |||
| from dora import AsyncNode | |||
| import cv2 | |||
| import numpy as np | |||
| class CameraNode(AsyncNode): | |||
|     def __init__(self): | |||
|         super().__init__() | |||
|         self.cap = None | |||
|     async def setup(self): | |||
|         self.cap = cv2.VideoCapture(0) | |||
|         if not self.cap.isOpened(): | |||
|             raise RuntimeError("Failed to open camera") | |||
|     async def on_input(self, input_data): | |||
|         ret, frame = self.cap.read() | |||
|         if not ret: | |||
|             raise RuntimeError("Failed to read frame") | |||
|         # Convert to RGB | |||
|         frame_rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB) | |||
|         return {"image": frame_rgb.tobytes(), "shape": frame_rgb.shape} | |||
|     async def cleanup(self): | |||
|         if self.cap: | |||
|             self.cap.release() | |||
| ``` | |||
| ### 3. Preprocessor Node (nodes/preprocessor.py) | |||
| ```python | |||
| from dora import Node | |||
| import cv2 | |||
| import numpy as np | |||
| class PreprocessorNode(Node): | |||
|     def __init__(self): | |||
|         super().__init__() | |||
|     def on_input(self, input_data): | |||
|         # Convert bytes back to a numpy array | |||
|         image = np.frombuffer(input_data["image"], dtype=np.uint8) | |||
|         image = image.reshape(input_data["shape"]) | |||
|         # Resize image | |||
|         resized = cv2.resize(image, (640, 480)) | |||
|         # Normalize to [0, 1] | |||
|         normalized = resized.astype(np.float32) / 255.0 | |||
|         return { | |||
|             "image": normalized.tobytes(), | |||
|             "shape": normalized.shape | |||
|         } | |||
| ``` | |||
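The `tobytes()`/`frombuffer()` hand-off used between the camera and preprocessor nodes can be verified in isolation; the round trip is lossless as long as the dtype and shape travel alongside the bytes:

```python
import numpy as np

# Simulate one node serializing a frame and the next one restoring it.
frame = np.arange(24, dtype=np.uint8).reshape(2, 4, 3)  # tiny fake image
payload = {"image": frame.tobytes(), "shape": frame.shape}

restored = np.frombuffer(payload["image"], dtype=np.uint8)
restored = restored.reshape(payload["shape"])

assert np.array_equal(frame, restored)  # lossless round trip
```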
| ### 4. Detector Node (nodes/detector.py) | |||
| ```python | |||
| from dora import Node | |||
| import numpy as np | |||
| class DetectorNode(Node): | |||
|     def __init__(self): | |||
|         super().__init__() | |||
|         # Load your ML model here | |||
|         self.model = None | |||
|     def on_input(self, input_data): | |||
|         # Convert to numpy array | |||
|         image = np.frombuffer(input_data["image"], dtype=np.float32) | |||
|         image = image.reshape(input_data["shape"]) | |||
|         # Run detection (simulated) | |||
|         detections = self.simulate_detection(image) | |||
|         return { | |||
|             "detections": detections, | |||
|             "image": input_data["image"], | |||
|             "shape": input_data["shape"] | |||
|         } | |||
|     def simulate_detection(self, image): | |||
|         # Simulate object detection | |||
|         return [ | |||
|             {"class": "person", "confidence": 0.95, "bbox": [100, 100, 200, 200]}, | |||
|             {"class": "car", "confidence": 0.88, "bbox": [300, 150, 400, 250]} | |||
|         ] | |||
| ``` | |||
| ### 5. Visualizer Node (nodes/visualizer.py) | |||
| ```python | |||
| from dora import Node | |||
| import cv2 | |||
| import numpy as np | |||
| class VisualizerNode(Node): | |||
|     def __init__(self): | |||
|         super().__init__() | |||
|     def on_input(self, input_data): | |||
|         # Convert back to numpy array | |||
|         image = np.frombuffer(input_data["image"], dtype=np.float32) | |||
|         image = image.reshape(input_data["shape"]) | |||
|         # Convert to uint8 for display | |||
|         image = (image * 255).astype(np.uint8) | |||
|         # Draw detections | |||
|         for det in input_data["detections"]: | |||
|             bbox = det["bbox"] | |||
|             cv2.rectangle(image, (bbox[0], bbox[1]), (bbox[2], bbox[3]), (0, 255, 0), 2) | |||
|             cv2.putText(image, f"{det['class']} {det['confidence']:.2f}", | |||
|                         (bbox[0], bbox[1] - 10), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 2) | |||
|         # Display image | |||
|         cv2.imshow("Detection Results", image) | |||
|         cv2.waitKey(1) | |||
|         return None | |||
| ``` | |||
| ### 6. Graph Configuration (dora.yml) | |||
| ```yaml | |||
| nodes: | |||
|   camera: | |||
|     inputs: [] | |||
|     outputs: ["image", "shape"] | |||
|     python: nodes/camera.py | |||
|   preprocessor: | |||
|     inputs: ["image", "shape"] | |||
|     outputs: ["image", "shape"] | |||
|     python: nodes/preprocessor.py | |||
|   detector: | |||
|     inputs: ["image", "shape"] | |||
|     outputs: ["detections", "image", "shape"] | |||
|     python: nodes/detector.py | |||
|   visualizer: | |||
|     inputs: ["detections", "image", "shape"] | |||
|     outputs: [] | |||
|     python: nodes/visualizer.py | |||
| edges: | |||
|   - from: camera | |||
|     to: preprocessor | |||
|     data: [image, shape] | |||
|   - from: preprocessor | |||
|     to: detector | |||
|     data: [image, shape] | |||
|   - from: detector | |||
|     to: visualizer | |||
|     data: [detections, image, shape] | |||
| ``` | |||
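A useful sanity check on a graph like this one: every field an edge carries should appear in the source node's `outputs` and the target node's `inputs`. A hypothetical standalone check (the `graph` dict mirrors the `dora.yml` above):

```python
# Mirror of the dora.yml graph as plain Python data.
graph = {
    "nodes": {
        "camera": {"inputs": [], "outputs": ["image", "shape"]},
        "preprocessor": {"inputs": ["image", "shape"], "outputs": ["image", "shape"]},
        "detector": {"inputs": ["image", "shape"], "outputs": ["detections", "image", "shape"]},
        "visualizer": {"inputs": ["detections", "image", "shape"], "outputs": []},
    },
    "edges": [
        {"from": "camera", "to": "preprocessor", "data": ["image", "shape"]},
        {"from": "preprocessor", "to": "detector", "data": ["image", "shape"]},
        {"from": "detector", "to": "visualizer", "data": ["detections", "image", "shape"]},
    ],
}

def check_edges(graph):
    """Return True if every edge only carries declared fields."""
    for edge in graph["edges"]:
        src = graph["nodes"][edge["from"]]["outputs"]
        dst = graph["nodes"][edge["to"]]["inputs"]
        if not all(f in src and f in dst for f in edge["data"]):
            return False
    return True

print(check_edges(graph))  # True
```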
| ## Running the Example | |||
| 1. Install dependencies: | |||
| ```bash | |||
| pip install -r requirements.txt | |||
| ``` | |||
| 2. Start the application: | |||
| ```bash | |||
| dora run | |||
| ``` | |||
| 3. You should see a window showing the camera feed with detected objects. | |||
| ## Understanding the Example | |||
| ### Advanced Concepts | |||
| - **Async Processing**: Camera node uses async/await for non-blocking I/O | |||
| - **Binary Data Handling**: Efficient image data transfer between nodes | |||
| - **Error Handling**: Proper cleanup and error propagation | |||
| - **External Libraries**: Integration with OpenCV | |||
| - **Complex Data Flow**: Multi-stage processing pipeline | |||
| ### Performance Considerations | |||
| - Zero-copy data transfer where possible | |||
| - Efficient image format conversions | |||
| - Proper resource cleanup | |||
| - Async processing for I/O operations | |||
| ## Next Steps | |||
| - Add real ML model integration | |||
| - Implement frame rate control | |||
| - Add configuration options | |||
| - Implement error recovery | |||
| - Add performance monitoring | |||
| @@ -0,0 +1,379 @@ | |||
| # Basic Examples | |||
| This guide provides a collection of basic examples to help you get started with Dora-RS. | |||
| ## Simple Data Processing Node | |||
| ### Python Example | |||
| ```python | |||
| from dora import Node, Config | |||
| class SimpleProcessor(Node): | |||
|     def __init__(self): | |||
|         super().__init__() | |||
|         self.counter = 0 | |||
|     def on_input(self, input_data): | |||
|         self.counter += 1 | |||
|         return { | |||
|             "count": self.counter, | |||
|             "processed_data": input_data | |||
|         } | |||
| # Configuration | |||
| config = Config( | |||
|     name="simple_processor", | |||
|     inputs=["input_stream"], | |||
|     outputs=["output_stream"] | |||
| ) | |||
| # Run the node | |||
| node = SimpleProcessor() | |||
| node.run(config) | |||
| ``` | |||
| ### Rust Example | |||
| ```rust | |||
| use dora_node_api::{Node, Config, Input, Output, Error}; | |||
| struct SimpleProcessor { | |||
|     counter: u64, | |||
| } | |||
| impl Node for SimpleProcessor { | |||
|     fn on_input(&mut self, input: Input) -> Result<Output, Error> { | |||
|         self.counter += 1; | |||
|         Ok(Output::new() | |||
|             .with("count", self.counter) | |||
|             .with("processed_data", input)) | |||
|     } | |||
| } | |||
| // Configuration | |||
| let config = Config::new("simple_processor") | |||
|     .with_inputs(vec!["input_stream"]) | |||
|     .with_outputs(vec!["output_stream"]); | |||
| // Run the node | |||
| let mut node = SimpleProcessor { counter: 0 }; | |||
| node.run(config)?; | |||
| ``` | |||
| ## Data Transformation Node | |||
| ### Python Example | |||
| ```python | |||
| from dora import Node, Config, DataType | |||
| class DataTransformer(Node): | |||
|     def __init__(self): | |||
|         super().__init__() | |||
|     def on_input(self, input_data): | |||
|         # Transform data | |||
|         transformed = { | |||
|             "uppercase": input_data["text"].upper(), | |||
|             "length": len(input_data["text"]), | |||
|             "words": len(input_data["text"].split()) | |||
|         } | |||
|         return transformed | |||
| # Configuration | |||
| config = Config( | |||
|     name="data_transformer", | |||
|     inputs=["text_input"], | |||
|     outputs=["transformed_output"], | |||
|     input_types={ | |||
|         "text_input": DataType.JSON | |||
|     } | |||
| ) | |||
| # Run the node | |||
| node = DataTransformer() | |||
| node.run(config) | |||
| ``` | |||
| ### Rust Example | |||
| ```rust | |||
| use dora_node_api::{Node, Config, DataType, Input, Output, Error}; | |||
| use maplit::hashmap; | |||
| use serde_json::json; | |||
| struct DataTransformer; | |||
| impl Node for DataTransformer { | |||
|     fn on_input(&mut self, input: Input) -> Result<Output, Error> { | |||
|         let text = input.get::<String>("text")?; | |||
|         let transformed = json!({ | |||
|             "uppercase": text.to_uppercase(), | |||
|             "length": text.len(), | |||
|             "words": text.split_whitespace().count() | |||
|         }); | |||
|         Ok(Output::new().with("transformed_output", transformed)) | |||
|     } | |||
| } | |||
| // Configuration | |||
| let config = Config::new("data_transformer") | |||
|     .with_inputs(vec!["text_input"]) | |||
|     .with_outputs(vec!["transformed_output"]) | |||
|     .with_input_types(hashmap! { | |||
|         "text_input".to_string() => DataType::Json, | |||
|     }); | |||
| // Run the node | |||
| let mut node = DataTransformer; | |||
| node.run(config)?; | |||
| ``` | |||
| ## Stateful Node | |||
| ### Python Example | |||
| ```python | |||
| from dora import Node, Config | |||
| from collections import deque | |||
| class StatefulProcessor(Node): | |||
|     def __init__(self, window_size=5): | |||
|         super().__init__() | |||
|         self.window = deque(maxlen=window_size) | |||
|     def on_input(self, input_data): | |||
|         self.window.append(input_data["value"]) | |||
|         # Calculate statistics over the sliding window | |||
|         stats = { | |||
|             "average": sum(self.window) / len(self.window), | |||
|             "min": min(self.window), | |||
|             "max": max(self.window), | |||
|             "count": len(self.window) | |||
|         } | |||
|         return stats | |||
| # Configuration | |||
| config = Config( | |||
|     name="stateful_processor", | |||
|     inputs=["value_stream"], | |||
|     outputs=["statistics"] | |||
| ) | |||
| # Run the node | |||
| node = StatefulProcessor(window_size=5) | |||
| node.run(config) | |||
| ``` | |||
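The windowed statistics can be traced without the Dora runtime; `deque(maxlen=5)` silently evicts the oldest value once the window is full:

```python
from collections import deque

window = deque(maxlen=5)
stats = None
for value in [10, 20, 30, 40, 50, 60]:
    window.append(value)
    stats = {
        "average": sum(window) / len(window),
        "min": min(window),
        "max": max(window),
        "count": len(window),
    }

# After the sixth value, the first (10) has been evicted.
print(stats)  # {'average': 40.0, 'min': 20, 'max': 60, 'count': 5}
```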
| ### Rust Example | |||
| ```rust | |||
| use dora_node_api::{Node, Config, Input, Output, Error}; | |||
| use serde_json::json; | |||
| use std::collections::VecDeque; | |||
| const WINDOW_SIZE: usize = 5; | |||
| struct StatefulProcessor { | |||
|     window: VecDeque<f64>, | |||
| } | |||
| impl Node for StatefulProcessor { | |||
|     fn on_input(&mut self, input: Input) -> Result<Output, Error> { | |||
|         let value = input.get::<f64>("value")?; | |||
|         self.window.push_back(value); | |||
|         // Evict the oldest value: with_capacity only pre-allocates, | |||
|         // it does not bound the deque's length. | |||
|         if self.window.len() > WINDOW_SIZE { | |||
|             self.window.pop_front(); | |||
|         } | |||
|         // Calculate statistics | |||
|         let stats = json!({ | |||
|             "average": self.window.iter().sum::<f64>() / self.window.len() as f64, | |||
|             "min": self.window.iter().fold(f64::INFINITY, |a, &b| a.min(b)), | |||
|             "max": self.window.iter().fold(f64::NEG_INFINITY, |a, &b| a.max(b)), | |||
|             "count": self.window.len() | |||
|         }); | |||
|         Ok(Output::new().with("statistics", stats)) | |||
|     } | |||
| } | |||
| // Configuration | |||
| let config = Config::new("stateful_processor") | |||
|     .with_inputs(vec!["value_stream"]) | |||
|     .with_outputs(vec!["statistics"]); | |||
| // Run the node | |||
| let mut node = StatefulProcessor { | |||
|     window: VecDeque::with_capacity(WINDOW_SIZE) | |||
| }; | |||
| node.run(config)?; | |||
| ``` | |||
| ## Error Handling Node | |||
| ### Python Example | |||
| ```python | |||
| from dora import Node, Config, NodeError | |||
| class SafeProcessor(Node): | |||
|     def __init__(self): | |||
|         super().__init__() | |||
|     def on_input(self, input_data): | |||
|         try: | |||
|             # Validate input | |||
|             if "value" not in input_data: | |||
|                 raise NodeError("Missing required field: value") | |||
|             value = float(input_data["value"]) | |||
|             if value < 0: | |||
|                 raise NodeError("Value must be non-negative") | |||
|             # Process data | |||
|             result = { | |||
|                 "square": value ** 2, | |||
|                 "cube": value ** 3, | |||
|                 "sqrt": value ** 0.5 | |||
|             } | |||
|             return result | |||
|         except NodeError: | |||
|             # Re-raise validation errors unchanged | |||
|             raise | |||
|         except ValueError as e: | |||
|             raise NodeError(f"Invalid value format: {e}") | |||
|         except Exception as e: | |||
|             raise NodeError(f"Processing error: {e}") | |||
| # Configuration | |||
| config = Config( | |||
|     name="safe_processor", | |||
|     inputs=["value_input"], | |||
|     outputs=["results"] | |||
| ) | |||
| # Run the node | |||
| node = SafeProcessor() | |||
| node.run(config) | |||
| ``` | |||
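The validate-then-compute flow can be exercised standalone, with plain `ValueError`s in place of `NodeError`:

```python
def safe_process(input_data):
    """Mirror SafeProcessor's checks without the Node machinery."""
    if "value" not in input_data:
        raise ValueError("Missing required field: value")
    value = float(input_data["value"])  # raises ValueError on a bad format
    if value < 0:
        raise ValueError("Value must be non-negative")
    return {"square": value ** 2, "cube": value ** 3, "sqrt": value ** 0.5}

print(safe_process({"value": "4"}))  # {'square': 16.0, 'cube': 64.0, 'sqrt': 2.0}
try:
    safe_process({"value": -1})
except ValueError as e:
    print(e)  # Value must be non-negative
```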
| ### Rust Example | |||
| ```rust | |||
| use dora_node_api::{Node, Config, NodeError}; | |||
| struct SafeProcessor; | |||
| impl Node for SafeProcessor { | |||
| fn on_input(&mut self, input: Input) -> Result<Output, Error> { | |||
| // Validate input | |||
| let value = input.get::<f64>("value") | |||
| .map_err(|e| NodeError::new(format!("Missing required field: value: {}", e)))?; | |||
| if value < 0.0 { | |||
| return Err(NodeError::new("Value must be non-negative".to_string())); | |||
| } | |||
| // Process data | |||
| let result = json!({ | |||
| "square": value * value, | |||
| "cube": value * value * value, | |||
| "sqrt": value.sqrt() | |||
| }); | |||
| Ok(Output::new().with("results", result)) | |||
| } | |||
| } | |||
| // Configuration | |||
| let config = Config::new("safe_processor") | |||
| .with_inputs(vec!["value_input"]) | |||
| .with_outputs(vec!["results"]); | |||
| // Run the node | |||
| let node = SafeProcessor; | |||
| node.run(config)?; | |||
| ``` | |||
| ## Resource Management Node | |||
| ### Python Example | |||
| ```python | |||
| from dora import Node, Config | |||
| import psutil | |||
| class ResourceMonitor(Node): | |||
|     def __init__(self): | |||
|         super().__init__() | |||
|         self.process = None | |||
|     def setup(self): | |||
|         self.process = psutil.Process() | |||
|     def on_input(self, input_data): | |||
|         # Monitor this process's resource usage | |||
|         stats = { | |||
|             "cpu_percent": self.process.cpu_percent(), | |||
|             "memory_percent": self.process.memory_percent(), | |||
|             "threads": self.process.num_threads(), | |||
|             "open_files": len(self.process.open_files()) | |||
|         } | |||
|         return stats | |||
|     def cleanup(self): | |||
|         self.process = None | |||
| # Configuration | |||
| config = Config( | |||
|     name="resource_monitor", | |||
|     inputs=["trigger"], | |||
|     outputs=["stats"] | |||
| ) | |||
| # Run the node | |||
| node = ResourceMonitor() | |||
| node.run(config) | |||
| ``` | |||
| ### Rust Example | |||
| ```rust | |||
| use dora_node_api::{Node, Config, Input, Output, Error, NodeError}; | |||
| use serde_json::json; | |||
| use sysinfo::{System, SystemExt, ProcessExt}; | |||
| // Note: sysinfo's API changes between versions, and some accessors below | |||
| // (thread and open-file counts) are platform-dependent; treat this as a sketch. | |||
| struct ResourceMonitor { | |||
|     sys: System, | |||
| } | |||
| impl Node for ResourceMonitor { | |||
|     fn setup(&mut self) -> Result<(), Error> { | |||
|         self.sys.refresh_all(); | |||
|         Ok(()) | |||
|     } | |||
|     fn on_input(&mut self, _input: Input) -> Result<Output, Error> { | |||
|         self.sys.refresh_all(); | |||
|         let current_pid = sysinfo::get_current_pid() | |||
|             .map_err(|e| NodeError::new(format!("Failed to get current process: {}", e)))?; | |||
|         let process = self.sys.get_process(current_pid) | |||
|             .ok_or_else(|| NodeError::new("Failed to get process info".to_string()))?; | |||
|         let stats = json!({ | |||
|             "cpu_percent": process.cpu_usage(), | |||
|             "memory_percent": process.memory() as f64 / self.sys.total_memory() as f64 * 100.0, | |||
|             "threads": process.thread_count(), | |||
|             "open_files": process.open_files().len() | |||
|         }); | |||
|         Ok(Output::new().with("stats", stats)) | |||
|     } | |||
| } | |||
| // Configuration | |||
| let config = Config::new("resource_monitor") | |||
|     .with_inputs(vec!["trigger"]) | |||
|     .with_outputs(vec!["stats"]); | |||
| // Run the node | |||
| let mut node = ResourceMonitor { | |||
|     sys: System::new_all() | |||
| }; | |||
| node.run(config)?; | |||
| ``` | |||
| ## Next Steps | |||
| 1. Try modifying these examples to suit your needs | |||
| 2. Explore the [Advanced Examples](./advanced.md) | |||
| 3. Check out the [API Documentation](../api/README.md) | |||
| 4. Join our [Community](../community.md) for more examples and support | |||
| @@ -0,0 +1,30 @@ | |||
| # Basic Examples | |||
| This directory contains simple examples that demonstrate the core concepts of Dora. | |||
| ## Examples | |||
| ### Hello World | |||
| [hello_world.md](hello_world.md) | |||
| - Demonstrates basic node creation | |||
| - Shows simple data flow between nodes | |||
| - Introduces core Dora concepts | |||
| - Perfect for getting started | |||
| ## Getting Started | |||
| 1. Choose an example from the list above | |||
| 2. Follow the step-by-step instructions | |||
| 3. Run the example using `dora run` | |||
| 4. Experiment with modifications | |||
| ## Learning Objectives | |||
| - Understanding node structure | |||
| - Basic data flow | |||
| - Configuration with YAML | |||
| - Running Dora applications | |||
| ## Next Steps | |||
| - Try modifying the examples | |||
| - Add more nodes | |||
| - Experiment with different data types | |||
| - Check out the [Advanced Examples](../advanced/README.md) | |||
| @@ -0,0 +1,107 @@ | |||
| # Hello World Example | |||
| This is a simple example that demonstrates the basic concepts of Dora. | |||
| ## Overview | |||
| This example shows how to: | |||
| 1. Create a basic Dora application | |||
| 2. Define a simple node | |||
| 3. Set up data flow between nodes | |||
| 4. Run the application | |||
| ## Project Structure | |||
| ``` | |||
| hello_world/ | |||
| ├── dora.yml | |||
| └── nodes/ | |||
|     ├── source.py | |||
|     └── sink.py | |||
| ``` | |||
| ## Implementation | |||
| ### 1. Create the Project | |||
| ```bash | |||
| dora new hello_world | |||
| cd hello_world | |||
| ``` | |||
| ### 2. Create Source Node (nodes/source.py) | |||
| ```python | |||
| from dora import Node | |||
| class SourceNode(Node): | |||
|     def __init__(self): | |||
|         super().__init__() | |||
|         self.counter = 0 | |||
|     def on_input(self, input_data): | |||
|         self.counter += 1 | |||
|         return {"message": f"Hello World {self.counter}!"} | |||
| ``` | |||
| ### 3. Create Sink Node (nodes/sink.py) | |||
| ```python | |||
| from dora import Node | |||
| class SinkNode(Node): | |||
|     def __init__(self): | |||
|         super().__init__() | |||
|     def on_input(self, input_data): | |||
|         print(f"Received: {input_data['message']}") | |||
|         return None | |||
| ``` | |||
| ### 4. Configure the Graph (dora.yml) | |||
| ```yaml | |||
| nodes: | |||
|   source: | |||
|     inputs: [] | |||
|     outputs: ["message"] | |||
|     python: nodes/source.py | |||
|   sink: | |||
|     inputs: ["message"] | |||
|     outputs: [] | |||
|     python: nodes/sink.py | |||
| edges: | |||
|   - from: source | |||
|     to: sink | |||
|     data: message | |||
| ``` | |||
| ## Running the Example | |||
| 1. Start the application: | |||
| ```bash | |||
| dora run | |||
| ``` | |||
| 2. You should see output like: | |||
| ``` | |||
| Received: Hello World 1! | |||
| Received: Hello World 2! | |||
| Received: Hello World 3! | |||
| ... | |||
| ``` | |||
| ## Understanding the Example | |||
| ### Data Flow | |||
| 1. The `SourceNode` generates messages with a counter | |||
| 2. Messages flow through the edge to the `SinkNode` | |||
| 3. The `SinkNode` prints the received messages | |||
| ### Key Concepts | |||
| - **Nodes**: Independent processing units | |||
| - **Edges**: Define data flow between nodes | |||
| - **Data Types**: Messages are passed as dictionaries | |||
| - **Node Lifecycle**: Nodes are initialized and process data continuously | |||
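The flow above can be traced with a plain-Python stand-in for the source node, wired to a list by hand rather than by the Dora runtime (the `Source` class and `emit` method here are illustrative, not Dora API):

```python
class Source:
    """Stand-in for SourceNode: emits a counted greeting."""
    def __init__(self):
        self.counter = 0
    def emit(self):
        self.counter += 1
        return {"message": f"Hello World {self.counter}!"}

received = []
source = Source()
for _ in range(3):
    msg = source.emit()              # the edge carries the dict...
    received.append(msg["message"])  # ...and the sink consumes it

print(received[-1])  # Hello World 3!
```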
| ## Next Steps | |||
| - Try modifying the message format | |||
| - Add more nodes to the graph | |||
| - Experiment with different data types | |||
| - Check out the [Advanced Examples](../advanced/README.md) for more complex scenarios | |||
| @@ -0,0 +1,74 @@ | |||
| # Getting Started with Dora-RS | |||
| ## Prerequisites | |||
| - Rust toolchain (latest stable version) | |||
| - Python 3.8 or higher | |||
| - Git | |||
| ## Quick Start | |||
| 1. Clone the repository: | |||
| ```bash | |||
| git clone https://github.com/dora-rs/dora.git | |||
| cd dora | |||
| ``` | |||
| 2. Install dependencies: | |||
| ```bash | |||
| # On Windows | |||
| ./install.ps1 | |||
| # On Unix-like systems | |||
| ./install.sh | |||
| ``` | |||
| 3. Run your first Dora application: | |||
| ```bash | |||
| # Example command will be added here | |||
| ``` | |||
| ## System Requirements | |||
| - Operating System: Windows, macOS, or Linux | |||
| - Memory: Minimum 4GB RAM (8GB recommended) | |||
| - Storage: At least 1GB free space | |||
| - Network: Internet connection for initial setup | |||
| ## Installation Guide | |||
| ### Windows Installation | |||
| 1. Install Rust using rustup | |||
| 2. Install Python from python.org | |||
| 3. Run the installation script: | |||
| ```powershell | |||
| ./install.ps1 | |||
| ``` | |||
| ### Unix-like Systems Installation | |||
| 1. Install Rust using rustup | |||
| 2. Install Python using your package manager | |||
| 3. Run the installation script: | |||
| ```bash | |||
| ./install.sh | |||
| ``` | |||
| ## Troubleshooting Common Issues | |||
| ### Installation Issues | |||
| - **Problem**: Rust installation fails | |||
| **Solution**: Ensure you have the latest rustup version and try again | |||
| - **Problem**: Python dependencies not found | |||
| **Solution**: Verify Python installation and try reinstalling dependencies | |||
| ### Runtime Issues | |||
| - **Problem**: Node connection errors | |||
| **Solution**: Check network connectivity and firewall settings | |||
| - **Problem**: Performance issues | |||
| **Solution**: Review system resources and optimize configuration | |||
| ## Next Steps | |||
| 1. Read the [Architecture Guide](./architecture.md) | |||
| 2. Try the [Basic Examples](./examples/basic.md) | |||
| 3. Explore the [API Documentation](./api/README.md) | |||
| 4. Join our [Community](./community.md) | |||
| @@ -0,0 +1,55 @@ | |||
| # Quick Start Guide | |||
| ## Prerequisites | |||
| - Rust toolchain (latest stable version) | |||
| - Python 3.8 or higher | |||
| - Git | |||
| ## System Requirements | |||
| - Operating System: Windows, macOS, or Linux | |||
| - Memory: Minimum 4GB RAM (8GB recommended) | |||
| - Storage: At least 1GB free space | |||
| - Network: Internet connection for initial setup | |||
| ## Quick Installation | |||
| ### Windows | |||
| ```powershell | |||
| # Install Rust | |||
| winget install Rust.Rustup | |||
| # Clone and setup Dora | |||
| git clone https://github.com/dora-rs/dora.git | |||
| cd dora | |||
| ./install.ps1 | |||
| ``` | |||
| ### macOS/Linux | |||
| ```bash | |||
| # Install Rust | |||
| curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh | |||
| # Clone and setup Dora | |||
| git clone https://github.com/dora-rs/dora.git | |||
| cd dora | |||
| ./install.sh | |||
| ``` | |||
| ## Your First Dora Application | |||
| 1. Create a new project: | |||
| ```bash | |||
| dora new my-first-app | |||
| cd my-first-app | |||
| ``` | |||
| 2. Run the example: | |||
| ```bash | |||
| dora run | |||
| ``` | |||
| ## Next Steps | |||
| - [Basic Examples](../examples/basic/README.md) | |||
| - [Python API Guide](../api/python/README.md) | |||
| - [Rust API Guide](../api/rust/README.md) | |||
| - [Architecture Overview](../architecture/README.md) | |||
| @@ -0,0 +1,23 @@ | |||
| # Troubleshooting Guide | |||
| This guide helps you solve common issues you might encounter while using Dora. | |||
| ## Common Issues | |||
| - [Installation Problems](installation.md) | |||
| - [Runtime Issues](runtime.md) | |||
| - [Performance Problems](performance.md) | |||
| - [Network Issues](network.md) | |||
| ## Debugging Tools | |||
| - [Logging Guide](logging.md) | |||
| - [Profiling Tools](profiling.md) | |||
| - [Network Debugging](network_debugging.md) | |||
| ## Error Messages | |||
| - [Common Error Messages](error_messages.md) | |||
| - [Error Solutions](error_solutions.md) | |||
| ## Getting Help | |||
| - [Community Support](../community/README.md) | |||
| - [Reporting Issues](reporting_issues.md) | |||
| - [FAQ](faq.md) | |||