Documentation (English)

AI Model Inference

Core concepts for using AI models

Inference means using a trained AI model to get results.

We feed new data into the model, and the model produces a result.

During inference the model does not learn; its internal parameters stay fixed.
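The fixed-parameter idea can be sketched in a few lines of Python. The weights and bias below are purely illustrative (they do not come from any real training run); the point is that inference only reads them, never updates them:

```python
def predict(features, weights, bias):
    """Apply fixed parameters to new input; nothing is updated."""
    score = sum(f * w for f, w in zip(features, weights)) + bias
    return "positive" if score > 0 else "negative"

# Parameters frozen after training (illustrative values)
WEIGHTS = [0.8, -0.5, 0.3]
BIAS = -0.1

# New, unseen input fed into the fixed model
print(predict([1.0, 0.2, 0.5], WEIGHTS, BIAS))  # → positive
```

Training would adjust `WEIGHTS` and `BIAS`; inference, as described above, only evaluates them against new data.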

Inference is the phase where we benefit from all the work done during training. The model is now used to:

  1. make predictions,
  2. classify data, or
  3. generate text or images.
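The third use, text generation, follows the same pattern: a fixed model repeatedly produces the next piece of output from what came before. As a toy sketch, a lookup table stands in for a trained language model's parameters (the table contents are made up for illustration):

```python
# Toy next-word table standing in for learned model parameters.
NEXT = {"the": "model", "model": "generates", "generates": "text"}

def generate(start, steps):
    """Greedily extend the text using the fixed table."""
    out = [start]
    for _ in range(steps):
        nxt = NEXT.get(out[-1])
        if nxt is None:
            break  # no continuation known
        out.append(nxt)
    return " ".join(out)

print(generate("the", 3))  # → the model generates text
```

A real language model replaces the table with billions of learned parameters, but the inference loop is conceptually the same: fixed parameters in, new output out.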

Some models are already trained and ready to use.

📚 Running Inference with AI Models

Ready to run inference? Check out our comprehensive AI Model Inference Guide with detailed settings documentation for all available models — text generation, embeddings, OCR, audio, and multimodal.

