Workflow Block YAML Schema

YAML configuration reference for Workflow blocks

Schema Definition

type: object
required:
  - type
  - name
  - inputs
properties:
  type:
    type: string
    enum: [workflow]
    description: Block type identifier
  name:
    type: string
    description: Display name for this workflow block
  inputs:
    type: object
    required:
      - workflowId
    properties:
      workflowId:
        type: string
        description: ID of the workflow to execute
      inputMapping:
        type: object
        description: Map current workflow data to sub-workflow inputs
        additionalProperties:
          type: string
          description: Input value or reference to parent workflow data
      environmentVariables:
        type: object
        description: Environment variables to pass to sub-workflow
        additionalProperties:
          type: string
          description: Environment variable value
      timeout:
        type: number
        description: Maximum execution time in milliseconds
        default: 300000
        minimum: 1000
        maximum: 1800000
  connections:
    type: object
    properties:
      success:
        type: string
        description: Target block ID for successful workflow completion
      error:
        type: string
        description: Target block ID for error handling

Connection Configuration

Connections define where the workflow goes based on the sub-workflow's results:

connections:
  success: <string>                     # Target block ID for successful completion
  error: <string>                       # Target block ID for error handling (optional)

Examples

Basic Workflow Execution

data-processor:
  type: workflow
  name: "Data Processing Workflow"
  inputs:
    workflowId: "data-processing-v2"
    inputMapping:
      rawData: <start.input>
      userId: <user-validator.userId>
    environmentVariables:
      PROCESSING_MODE: "production"
      LOG_LEVEL: "info"
  connections:
    success: process-results
    error: workflow-error-handler

Content Generation Pipeline

content-generator:
  type: workflow
  name: "Content Generation Pipeline"
  inputs:
    workflowId: "content-generation-v3"
    inputMapping:
      topic: <start.topic>
      style: <style-analyzer.recommendedStyle>
      targetAudience: <audience-detector.audience>
      brandGuidelines: <brand-config.guidelines>
    environmentVariables:
      CONTENT_API_KEY: "{{CONTENT_API_KEY}}"
      QUALITY_THRESHOLD: "high"
    timeout: 120000
  connections:
    success: review-content
    error: content-generation-failed

Multi-Step Analysis Workflow

analysis-workflow:
  type: workflow
  name: "Analysis Workflow"
  inputs:
    workflowId: "comprehensive-analysis"
    inputMapping:
      document: <document-processor.content>
      analysisType: "comprehensive"
      includeMetrics: true
      outputFormat: "structured"
    environmentVariables:
      ANALYSIS_MODEL: "gpt-4o"
      OPENAI_API_KEY: "{{OPENAI_API_KEY}}"
      CLAUDE_API_KEY: "{{CLAUDE_API_KEY}}"
  connections:
    success: compile-analysis-report
    error: analysis-workflow-error

Conditional Workflow Execution

customer-workflow-router:
  type: condition
  name: "Customer Workflow Router"
  inputs:
    conditions:
      if: <customer-type.type> === "enterprise"
      else-if: <customer-type.type> === "premium"
      else: true
  connections:
    conditions:
      if: enterprise-workflow
      else-if: premium-workflow  
      else: standard-workflow

enterprise-workflow:
  type: workflow
  name: "Enterprise Customer Workflow"
  inputs:
    workflowId: "enterprise-customer-processing"
    inputMapping:
      customerData: <customer-data.profile>
      accountManager: <account-assignment.manager>
      tier: "enterprise"
    environmentVariables:
      PRIORITY_LEVEL: "high"
      SLA_REQUIREMENTS: "strict"
  connections:
    success: enterprise-complete

premium-workflow:
  type: workflow
  name: "Premium Customer Workflow"
  inputs:
    workflowId: "premium-customer-processing"
    inputMapping:
      customerData: <customer-data.profile>
      supportLevel: "premium"
    environmentVariables:
      PRIORITY_LEVEL: "medium"
  connections:
    success: premium-complete

standard-workflow:
  type: workflow
  name: "Standard Customer Workflow"
  inputs:
    workflowId: "standard-customer-processing"
    inputMapping:
      customerData: <customer-data.profile>
    environmentVariables:
      PRIORITY_LEVEL: "standard"
  connections:
    success: standard-complete

Parallel Workflow Execution

parallel-workflows:
  type: parallel
  name: "Parallel Workflow Processing"
  inputs:
    parallelType: collection
    collection: |
      [
        {"workflowId": "sentiment-analysis", "focus": "sentiment"},
        {"workflowId": "topic-extraction", "focus": "topics"},
        {"workflowId": "entity-recognition", "focus": "entities"}
      ]
  connections:
    success: merge-workflow-results

execute-analysis-workflow:
  type: workflow
  name: "Execute Analysis Workflow"
  parentId: parallel-workflows
  inputs:
    workflowId: <parallel.currentItem.workflowId>
    inputMapping:
      content: <start.content>
      analysisType: <parallel.currentItem.focus>
    environmentVariables:
      ANALYSIS_API_KEY: "{{ANALYSIS_API_KEY}}"
  connections:
    success: workflow-complete

Workflow Error Handling

main-workflow:
  type: workflow
  name: "Main Processing Workflow"
  inputs:
    workflowId: "main-processing-v1"
    inputMapping:
      data: <start.input>
    timeout: 180000
  connections:
    success: main-complete
    error: error-recovery-workflow

error-recovery-workflow:
  type: workflow
  name: "Error Recovery Workflow"
  inputs:
    workflowId: "error-recovery-v1"
    inputMapping:
      originalInput: <start.input>
      errorDetails: <main-workflow.error>
      failureTimestamp: "{{new Date().toISOString()}}"
    environmentVariables:
      RECOVERY_MODE: "automatic"
      FALLBACK_ENABLED: "true"
  connections:
    success: recovery-complete
    error: manual-intervention-required

Input Mapping

Map data from the parent workflow to the sub-workflow:

inputMapping:
  # Static values
  mode: "production"
  version: "1.0"
  
  # References to parent workflow data
  userData: <user-processor.profile>
  settings: <config-loader.settings>
  
  # Complex object mapping
  requestData:
    id: <start.requestId>
    timestamp: "{{new Date().toISOString()}}"
    source: "parent-workflow"

Output References

Once a workflow block completes, you can reference its outputs:

# In subsequent blocks
next-block:
  inputs:
    workflowResult: <workflow-name.output>    # Sub-workflow output
    executionTime: <workflow-name.duration>  # Execution duration
    status: <workflow-name.status>           # Execution status
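
For example, using the data-processor block defined earlier, its success target process-results could consume these fields directly. This is a minimal sketch; the rest of the process-results configuration is omitted for brevity:

process-results:
  inputs:
    workflowResult: <data-processor.output>    # Result returned by the sub-workflow
    executionTime: <data-processor.duration>   # Execution duration
    status: <data-processor.status>            # Completion status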

Best Practices

  • Use descriptive workflow IDs for clarity
  • Map only the data that sub-workflows need
  • Set appropriate timeouts based on workflow complexity
  • Include error handling for robust execution
  • Pass environment variables securely
  • Test sub-workflows independently first
  • Monitor the performance of nested workflows
  • Use versioned workflow IDs for stability (see the combined sketch below)
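
The sketch below combines several of these practices: a versioned workflow ID, a minimal input mapping, an environment variable passed by reference rather than hard-coded, an explicit timeout, and an error connection. The block, workflow, and variable names are illustrative only:

report-generator:
  type: workflow
  name: "Report Generation Workflow"
  inputs:
    workflowId: "report-generation-v4"       # Versioned ID for stability
    inputMapping:
      reportData: <data-processor.output>    # Map only the data the sub-workflow needs
    environmentVariables:
      REPORT_API_KEY: "{{REPORT_API_KEY}}"   # Pass secrets by reference, never inline
    timeout: 240000                          # Sized to the workflow's complexity
  connections:
    success: report-complete
    error: report-error-handler              # Explicit error handling path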