
Hyrex provides automatic retry mechanisms to handle transient failures and ensure reliable task execution.

Basic Retry Configuration

Configure retries using the maxRetries option:
import { HyrexRegistry } from '@hyrex/hyrex';

const hy = new HyrexRegistry();

const resilientTask = hy.task({
    name: 'resilientTask',
    config: {
        maxRetries: 3  // Retry up to 3 times on failure (valid range: 0-10)
    },
    func: async (data) => {
        // Task that might fail transiently
        const result = await unreliableService.call(data);
        return result;
    }
});

Default Behavior

  • Default retries: 3 (up to 4 total attempts; configurable 0-10)
  • Retry trigger: Any thrown exception
  • Success condition: Task returns without throwing
  • Backoff strategy: Exponential backoff (built-in)

Exponential Backoff

The TypeScript SDK implements exponential backoff automatically:
  • First 5 failures: No delay (offset period)
  • After offset: 1.5s, 3s, 4.5s, 6s… up to 15s max
  • Backoff resets on successful execution
// Task with retries will automatically use exponential backoff
const apiTask = hy.task({
    name: 'apiTask',
    config: {
        maxRetries: 5,
        timeoutSeconds: 30
    },
    func: async (data) => {
        // Failures will retry with increasing delays
        const response = await externalAPI.post(data);
        return response.data;
    }
});
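
To make the schedule concrete, here is a sketch of the documented delay curve. The SDK computes this internally; backoffDelayMs is purely illustrative, not part of the API:
// Failures 1-5 retry immediately; after that, the delay grows by
// 1.5s per failure (1.5s, 3s, 4.5s, ...) and is capped at 15s.
function backoffDelayMs(failureCount: number): number {
    const offsetFailures = 5;  // First 5 failures: no delay
    if (failureCount <= offsetFailures) return 0;
    const stepMs = 1500;
    return Math.min((failureCount - offsetFailures) * stepMs, 15_000);
}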

Retry Patterns

Conditional Retries

Distinguish transient from permanent errors in your handler:
class RetryableError extends Error {}
class PermanentError extends Error {}

const smartTask = hy.task({
    name: 'smartTask',
    config: {
        maxRetries: 3
    },
    func: async (data) => {
        try {
            return await riskyOperation(data);
        } catch (error: any) {
            if (error.code === 'RATE_LIMIT') {
                // This will retry with backoff
                throw new RetryableError('Rate limited, will retry');
            } else if (error.code === 'INVALID_INPUT') {
                // This will still retry (but probably fail again)
                // Consider returning an error instead
                throw new PermanentError("Invalid input, retrying won't help");
            }
            // Unknown errors also retry
            throw error;
        }
    }
});
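
Note that Hyrex retries on any thrown exception (see Limitations below), so the error classes above mainly make intent explicit and aid logging. To actually prevent a retry for a permanent failure, return a value instead of throwing, as shown next.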

Return vs Throw

Control retry behavior with return values:
const flexibleTask = hy.task({
    name: 'flexibleTask',
    config: {
        maxRetries: 3
    },
    func: async (data) => {
        try {
            const result = await operation(data);
            return { success: true, result };
        } catch (error: any) {
            // Option 1: Throw to trigger retry
            if (isTransient(error)) {
                throw error;  // Will retry
            }
            
            // Option 2: Return error to complete task
            return { 
                success: false, 
                error: error.message,
                permanent: true  // Won't retry
            };
        }
    }
});

Using Attempt Number

Access the current attempt for custom logic:
import { getHyrexContext } from '@hyrex/hyrex';

const adaptiveTask = hy.task({
    name: 'adaptiveTask',
    config: {
        maxRetries: 3
    },
    func: async (data) => {
        const ctx = getHyrexContext();
        const attemptNumber = ctx.attemptNumber; // 0-based
        
        console.log(`Attempt ${attemptNumber + 1} of ${ctx.maxRetries + 1}`);
        
        // Use different strategies based on attempt
        if (attemptNumber === 0) {
            // First attempt - try primary service
            return await primaryService.process(data);
        } else if (attemptNumber < ctx.maxRetries) {
            // Intermediate attempts - try backup service
            return await backupService.process(data);
        } else {
            // Final attempt - use fallback
            return await fallbackProcessing(data);
        }
    }
});
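
With maxRetries: 3, the task above runs at most four times: the primary service on attempt 0, the backup service on attempts 1 and 2, and the fallback on the final attempt.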

Common Retry Scenarios

External API Calls

const apiCallTask = hy.task({
    name: 'apiCall',
    config: {
        maxRetries: 5,
        timeoutSeconds: 30
    },
    func: async ({ endpoint, payload }) => {
        try {
            const response = await fetch(endpoint, {
                method: 'POST',
                body: JSON.stringify(payload),
                headers: { 'Content-Type': 'application/json' }
            });
            
            if (!response.ok) {
                // Non-2xx status codes will retry
                throw new Error(`API error: ${response.status}`);
            }
            
            return await response.json();
        } catch (error: any) {
            // Network errors, timeouts, etc. will retry
            console.error(`API call failed: ${error.message}`);
            throw error;
        }
    }
});
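
In practice you may not want every non-2xx status to retry. A common refinement, built on the return-vs-throw pattern above (a sketch, not a Hyrex API), is to throw only for statuses that are plausibly transient:
const selectiveApiTask = hy.task({
    name: 'selectiveApiCall',
    config: {
        maxRetries: 5,
        timeoutSeconds: 30
    },
    func: async ({ endpoint, payload }) => {
        const response = await fetch(endpoint, {
            method: 'POST',
            body: JSON.stringify(payload),
            headers: { 'Content-Type': 'application/json' }
        });
        
        if (response.status >= 500 || response.status === 429) {
            // Server errors and rate limits are plausibly transient: retry
            throw new Error(`Transient API error: ${response.status}`);
        }
        if (!response.ok) {
            // Other 4xx responses won't improve on retry: complete the task
            return { failed: true, status: response.status };
        }
        return await response.json();
    }
});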

Database Operations

const dbTask = hy.task({
    name: 'dbOperation',
    config: {
        maxRetries: 2,  // Fewer retries for DB operations
        queue: 'database'
    },
    func: async ({ query, params }) => {
        try {
            return await db.execute(query, params);
        } catch (error: any) {
            if (error.code === 'DEADLOCK_DETECTED') {
                // Deadlocks are retryable
                throw error;
            } else if (error.code === 'CONSTRAINT_VIOLATION') {
                // Don't retry constraint violations
                return { 
                    error: 'Constraint violation',
                    code: error.code,
                    failed: true
                };
            }
            throw error;
        }
    }
});

File Processing

const fileTask = hy.task({
    name: 'processFile',
    config: {
        maxRetries: 3,
        timeoutSeconds: 300  // 5 minutes
    },
    func: async ({ fileUrl }) => {
        const ctx = getHyrexContext();
        let file;  // Declared outside try so the catch block can clean up
        
        try {
            // Download file
            file = await downloadFile(fileUrl);
            
            // Process file
            const result = await processFile(file);
            
            // Cleanup
            await cleanup(file);
            
            return result;
        } catch (error: any) {
            // Log attempt for debugging
            console.error(
                `File processing failed on attempt ${ctx.attemptNumber + 1}:`,
                error.message
            );
            
            // Always cleanup on failure (file is undefined if the download failed)
            if (file) {
                await cleanup(file).catch(() => {});
            }
            
            throw error;  // Retry
        }
    }
});

Best Practices

  1. Set Appropriate Retry Limits
    // Quick operations: fewer retries
    config: { maxRetries: 2 }
    
    // External services: more retries
    config: { maxRetries: 5 }
    
    // Critical operations: max retries
    config: { maxRetries: 10 }
    
  2. Consider Idempotency
    const idempotentTask = hy.task({
        name: 'chargePayment',
        config: {
            maxRetries: 3,
            idempotencyKey: 'payment-123'  // Prevents duplicate charges
        },
        func: async (data) => {
            // Safe to retry without side effects
            return await paymentGateway.charge(data);
        }
    });
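    Idempotency matters because a retried task re-executes its side effects: if a charge succeeds but the worker fails before the result is recorded, the retry would charge again without a key like the one above.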
    
  3. Log Retry Attempts
    const loggingTask = hy.task({
        name: 'withLogging',
        config: { maxRetries: 3 },
        func: async (data) => {
            const ctx = getHyrexContext();
            
            if (ctx.attemptNumber > 0) {
                console.log(`Retry attempt ${ctx.attemptNumber} for task ${ctx.taskId}`);
            }
            
            try {
                return await operation(data);
            } catch (error) {
                console.error(`Attempt ${ctx.attemptNumber} failed:`, error);
                throw error;
            }
        }
    });
    
  4. Handle Final Failures
    const gracefulTask = hy.task({
        name: 'gracefulFailure',
        config: { maxRetries: 3 },
        func: async (data) => {
            const ctx = getHyrexContext();
            
            try {
                return await riskyOperation(data);
            } catch (error: any) {
                // On final attempt, return error state
                if (ctx.attemptNumber === ctx.maxRetries) {
                    await notifyFailure(ctx.taskId, error);
                    return {
                        status: 'failed_permanently',
                        error: error.message,
                        attempts: ctx.attemptNumber + 1
                    };
                }
                throw error;  // Retry
            }
        }
    });
    

Limitations

  • Max retries: Limited to 10 attempts
  • No custom backoff: Exponential backoff is built-in and not configurable
  • All exceptions retry: Cannot configure which error types retry
  • No retry delay control: Cannot set custom delays between retries
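
Since every thrown exception retries, per-error control has to live in your handler. One way to centralize the return-instead-of-throw pattern is a small wrapper. This is a sketch built only from the patterns above; withPermanentErrors and riskyOperation are illustrative, not Hyrex APIs:
class PermanentError extends Error {}

// Wrap a task function so a PermanentError completes the task
// (by returning) instead of triggering another retry.
function withPermanentErrors<T, R>(fn: (data: T) => Promise<R>) {
    return async (data: T) => {
        try {
            return await fn(data);
        } catch (error) {
            if (error instanceof PermanentError) {
                // Returning rather than throwing means: no retry
                return { failed: true, reason: error.message };
            }
            throw error;  // Anything else retries as usual
        }
    };
}

const guardedTask = hy.task({
    name: 'guardedTask',
    config: { maxRetries: 3 },
    func: withPermanentErrors(async (data: any) => {
        // Throw PermanentError inside to stop retries early
        return await riskyOperation(data);
    })
});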

Next Steps

  • Tasks: Configure tasks with retries
  • Error Handling: Handle errors in tasks
  • Queues: Route retryable tasks