What to Do If Model Output Is Garbled?

Try lowering the Temperature value in settings, for example to a value below 0.7.

Temperature is a parameter that controls the randomness of LLM-generated results.

  • When Temperature is low (e.g., 0.1), generated results are more deterministic and conservative, concentrating probability on the most likely tokens.

  • When Temperature is high (e.g., 1.0), generated results are more random and diverse, potentially producing unexpected and innovative content.

By adjusting the Temperature value, you can control the creativity and risk level of LLM output. Lower Temperature is more suitable for tasks requiring reliability and consistency, while higher Temperature can be used to encourage creative and exploratory applications.
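Mechanically, temperature works by dividing each token's logit by the Temperature value before the softmax step, so lower values sharpen the distribution toward the top token and higher values flatten it. A minimal sketch of this scaling (the function name and example logits are illustrative, not from any specific LLM library):

```python
import math

def softmax_with_temperature(logits, temperature):
    """Turn raw logits into sampling probabilities, scaled by temperature."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]  # hypothetical scores for three candidate tokens

low = softmax_with_temperature(logits, 0.1)   # sharp: top token dominates
high = softmax_with_temperature(logits, 1.0)  # flatter: more randomness
```

With Temperature 0.1, nearly all probability mass lands on the highest-scoring token, so sampling is close to deterministic; with Temperature 1.0, the other tokens retain meaningful probability and output varies more from run to run.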
