Summary Processor #

Generates AI-powered document summaries and insights with structured analysis.

Configuration #

| Parameter | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| message_field | string | No | documents | The field in the pipeline context containing the documents to process |
| output_queue | object | No | null | Optional queue configuration for sending processed documents to an output queue |
| model_provider | string | Yes | - | ID of the LLM provider |
| model | string | Yes | - | Name of the LLM model |
| model_context_length | uint32 | Yes | - | Minimum context length of the model (min: 4000 tokens) |
| min_input_document_length | uint32 | No | 100 | Minimum document size, in bytes, for a document to be processed |
| max_input_document_length | uint32 | No | 100000 | Maximum document size to process |
| ai_insights_max_length | uint32 | No | 500 | Target length for AI insights (in tokens) |

Example #

```yaml
- document_summarization:
  model_provider: openai
  model: gpt-4o-mini
  model_context_length: 4000
  min_input_document_length: 100
  max_input_document_length: 100000
  ai_insights_max_length: 500
```
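
If the documents live under a different field in the pipeline context, message_field can be overridden. The sketch below is for illustration only: articles is a hypothetical field name, and the remaining values simply repeat the defaults from the table above.

```yaml
# Sketch only: "articles" is a hypothetical field name used for illustration.
- document_summarization:
  message_field: articles        # read documents from the "articles" context field
  model_provider: openai
  model: gpt-4o-mini
  model_context_length: 4000
```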