Correct the sampling params #67

Merged
ykhrustalev merged 7 commits into main from ykhrustalev/sampling
Feb 24, 2026
Conversation


@ykhrustalev ykhrustalev commented Feb 24, 2026

Problem

Inference examples across the docs used incorrect or missing sampling parameters (e.g., temperature=0.7), leading users to suboptimal generation quality.

Upstream HF Model Card Recommended Settings

| Model Family               | temp | top_k | top_p | min_p | repeat_penalty |
|----------------------------|------|-------|-------|-------|----------------|
| LFM2.5-1.2B-Instruct       | 0.1  | 50    | —     | —     | 1.05           |
| LFM2.5-1.2B-Thinking       | 0.1  | 50    | 0.1   | —     | 1.05           |
| LFM2 text + LFM2.5-JP      | 0.3  | —     | —     | 0.15  | 1.05           |
| All VL models              | 0.1  | —     | —     | 0.15  | 1.05           |
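For reference, the table above can be expressed as a small lookup keyed by model family, suitable for splatting into a Transformers `generate()` call. This is an illustrative sketch, not the snippet-system code from this PR; the helper name and dict layout are assumptions, and the keys follow Transformers argument spelling (`repetition_penalty`), while some engines spell that field differently (see the deployment-guide changes below).

```python
# Recommended sampling settings per model family, transcribed from the
# table above. Keys use Hugging Face Transformers generate() spelling.
RECOMMENDED_SAMPLING = {
    "LFM2.5-1.2B-Instruct": {"temperature": 0.1, "top_k": 50, "repetition_penalty": 1.05},
    "LFM2.5-1.2B-Thinking": {"temperature": 0.1, "top_k": 50, "top_p": 0.1, "repetition_penalty": 1.05},
    "LFM2-text/LFM2.5-JP":  {"temperature": 0.3, "min_p": 0.15, "repetition_penalty": 1.05},
    "LFM2-VL":              {"temperature": 0.1, "min_p": 0.15, "repetition_penalty": 1.05},
}

def sampling_kwargs(model_family: str) -> dict:
    """Return generate()-style sampling kwargs for a model family."""
    params = dict(RECOMMENDED_SAMPLING[model_family])
    # Transformers ignores temperature/top_k/top_p unless sampling is enabled.
    params["do_sample"] = True
    return params
```

Typical use would be `model.generate(**inputs, **sampling_kwargs("LFM2.5-1.2B-Instruct"))`.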

Changes

  • Snippet system — Added samplingFlags/samplingParams props to all 6 snippet components so each model page passes its correct params
  • 17 model pages — Added per-model sampling props to TextTransformers, TextLlamacpp, TextVllm, VlLlamacpp
  • 6 deployment guides — Fixed sampling params in llama.cpp, Transformers, vLLM, Ollama, LM Studio, MLX; used extra_body for non-standard OpenAI client params; fixed engine-specific field names (repeat_penalty for Ollama/LM Studio, repetition_penalty for vLLM/llama-server/MLX)
  • 2 notebooks — Fixed LFM2_Inference_with_llama_cpp.ipynb and quickstart_snippets.ipynb with correct params
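The extra_body and field-name fixes in the deployment guides can be sketched as follows. This is a hypothetical helper, not code from this PR: the OpenAI Python client rejects unknown top-level keyword arguments, so non-standard parameters like the repetition penalty must be passed via `extra_body`, and the field name depends on the serving engine.

```python
# Engine-specific spelling of the repetition-penalty field, per the PR:
# Ollama and LM Studio expect "repeat_penalty"; vLLM, llama-server and
# MLX expect "repetition_penalty".
PENALTY_FIELD = {
    "ollama": "repeat_penalty",
    "lmstudio": "repeat_penalty",
    "vllm": "repetition_penalty",
    "llama-server": "repetition_penalty",
    "mlx": "repetition_penalty",
}

def completion_kwargs(engine: str, temperature: float = 0.1, penalty: float = 1.05) -> dict:
    """Build chat-completion kwargs for an OpenAI-compatible endpoint.

    The penalty goes in extra_body because the OpenAI client does not
    accept it as a top-level keyword argument.
    """
    return {
        "temperature": temperature,
        "extra_body": {PENALTY_FIELD[engine]: penalty},
    }
```

With the real client this would be used as `client.chat.completions.create(model=..., messages=..., **completion_kwargs("vllm"))`.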

@ykhrustalev ykhrustalev merged commit b60719b into main Feb 24, 2026
7 checks passed
@ykhrustalev ykhrustalev deleted the ykhrustalev/sampling branch February 24, 2026 15:42
