BART

Sequence-to-sequence model for abstractive summarization and text generation

BART (Bidirectional and Auto-Regressive Transformers) is a denoising autoencoder pre-trained for sequence-to-sequence tasks. The large-cnn variant is specifically optimized for news summarization. It can be used with the base weights or a fine-tuned checkpoint.

When to use:

  • Abstractive summarization of articles or documents
  • Text transformation and paraphrasing
  • Custom seq2seq tasks after fine-tuning

Input: Text to summarize or transform, plus an optional fine-tuned checkpoint
Output: Generated summary or transformed text, plus metadata (lengths, tokens)

Model Settings

Model Variant (default: large-cnn, options: base / large / large-cnn) Which BART checkpoint to load.

  • base: Smaller, faster, lower accuracy
  • large: Full-size model, general-purpose
  • large-cnn: Fine-tuned on CNN/DailyMail — best for news summarization
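Selecting a variant can be sketched as a small lookup. This is a hypothetical helper, not the node's actual code; the Hugging Face Hub IDs (facebook/bart-base, facebook/bart-large, facebook/bart-large-cnn) are assumed checkpoint names, and a fine-tuned checkpoint path, if given, takes precedence over the variant.

```python
from typing import Optional

# Assumed Hugging Face Hub IDs for the three documented variants.
BART_VARIANTS = {
    "base": "facebook/bart-base",
    "large": "facebook/bart-large",
    "large-cnn": "facebook/bart-large-cnn",
}

def resolve_checkpoint(variant: str = "large-cnn",
                       fine_tuned: Optional[str] = None) -> str:
    """Return the checkpoint to load; a fine-tuned path overrides the variant."""
    if fine_tuned is not None:
        return fine_tuned
    if variant not in BART_VARIANTS:
        raise ValueError(f"unknown variant {variant!r}; "
                         f"choose from {sorted(BART_VARIANTS)}")
    return BART_VARIANTS[variant]
```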

Inference Settings

Max Length (default: 142, range: 10–1024) Maximum token length of the generated output.

  • Increase for longer documents that need fuller summaries
  • Decrease for concise bullet-point style summaries

Min Length (default: 56) Minimum token length of the generated output.

  • Prevents trivially short outputs
  • Set lower if short summaries are acceptable
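The two length settings constrain each other: the minimum must not exceed the maximum, and the maximum must stay within the documented 10–1024 range. A minimal validation sketch (the helper name and exact error handling are illustrative, not part of the node):

```python
def validate_lengths(max_length: int = 142, min_length: int = 56):
    """Enforce the documented bounds: 10 <= max_length <= 1024
    and 0 <= min_length <= max_length."""
    if not 10 <= max_length <= 1024:
        raise ValueError(f"max_length {max_length} outside the 10-1024 range")
    if not 0 <= min_length <= max_length:
        raise ValueError(f"min_length {min_length} must be between 0 and max_length")
    return max_length, min_length
```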

Num Beams (default: 4) Number of beams for beam search.

  • 1: Greedy decoding (fastest, lowest quality)
  • 4: Good balance of quality and speed
  • 8+: Higher quality, slower

Length Penalty (default: 2.0) Exponential penalty applied to sequence length during beam search.

  • Values > 1.0: Encourage longer outputs
  • Values < 1.0: Encourage shorter outputs
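The direction of this effect follows from how beam scores are length-normalized: the cumulative log-probability (always ≤ 0) is divided by length raised to the penalty, as in Hugging Face's beam scorer. A larger exponent pulls long sequences' scores closer to zero, so they rank higher. A minimal sketch of that normalization:

```python
def normalized_score(sum_logprob: float, length: int,
                     length_penalty: float = 2.0) -> float:
    """Length-normalized beam score: sum_logprob / length**length_penalty.
    Since sum_logprob <= 0, a larger penalty favors longer sequences."""
    return sum_logprob / (length ** length_penalty)
```

With the default penalty of 2.0, a 4-token beam at log-prob -4.0 scores -0.25 and beats a 2-token beam at -2.0 (score -0.5); with a penalty of 0.5 the ranking flips.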

Early Stopping (default: true) Stop beam search when all beams have finished generating.

  • Keep true for standard use — prevents wasted computation
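Putting the settings together, the defaults above can be collected into one set of generation keyword arguments. The helper below is a hypothetical sketch; the commented usage assumes Hugging Face transformers' `generate` API for BART and is not executed here.

```python
# Generation settings mirroring this page's documented defaults.
GENERATION_DEFAULTS = {
    "max_length": 142,
    "min_length": 56,
    "num_beams": 4,
    "length_penalty": 2.0,
    "early_stopping": True,
}

def generation_kwargs(**overrides):
    """Merge user overrides onto the documented defaults."""
    unknown = set(overrides) - set(GENERATION_DEFAULTS)
    if unknown:
        raise ValueError(f"unknown settings: {sorted(unknown)}")
    return {**GENERATION_DEFAULTS, **overrides}

# Assumed usage with Hugging Face transformers (not executed here):
#   from transformers import BartForConditionalGeneration, BartTokenizer
#   tok = BartTokenizer.from_pretrained("facebook/bart-large-cnn")
#   model = BartForConditionalGeneration.from_pretrained("facebook/bart-large-cnn")
#   ids = tok(article, return_tensors="pt", truncation=True).input_ids
#   summary_ids = model.generate(ids, **generation_kwargs())
#   summary = tok.decode(summary_ids[0], skip_special_tokens=True)
```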
