softmax_temperature

Controls prediction sharpness (classification only):
  • Lower values (e.g., 0.7): sharper, more confident predictions — useful when accuracy is already high
  • Higher values (e.g., 1.2): softer, more calibrated predictions — useful when probability calibration matters
from tabpfn import TabPFNClassifier

model = TabPFNClassifier(softmax_temperature=0.8)
If you use tuning_config={"calibrate_temperature": True}, the temperature is tuned automatically and overrides this value.
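The effect of the temperature can be sketched independently of TabPFN: the logits are divided by the temperature before the softmax, which sharpens the distribution below 1 and flattens it above 1. A minimal illustration with made-up logits (not TabPFN internals):

```python
import math

def softmax_with_temperature(logits, temperature):
    # Divide logits by the temperature, then apply a numerically stable softmax.
    scaled = [x / temperature for x in logits]
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]  # hypothetical class logits
sharp = softmax_with_temperature(logits, 0.7)  # lower temperature: more confident
soft = softmax_with_temperature(logits, 1.2)   # higher temperature: softer
# The top-class probability is larger at temperature 0.7 than at 1.2.
```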

Metric Tuning

For metrics that are sensitive to decision thresholds (F1, balanced accuracy, precision, recall), use the built-in metric tuning:
from tabpfn import TabPFNClassifier

model = TabPFNClassifier(
    eval_metric="f1",
    tuning_config={
        "calibrate_temperature": True,
        "tune_decision_thresholds": True,
    },
)
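The idea behind threshold tuning can be approximated by hand: sweep a decision threshold over held-out predicted probabilities and keep the one that maximizes the target metric. A hedged sketch with toy data (the data and the plain-Python `f1` helper are illustrative, not TabPFN's implementation):

```python
def f1(y_true, y_pred):
    # Plain-Python F1 for binary labels.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 0.0

def best_threshold(y_true, proba):
    # Sweep the observed probabilities as candidate thresholds and
    # keep the one with the highest F1.
    candidates = sorted(set(proba))
    return max(candidates, key=lambda t: f1(y_true, [int(p >= t) for p in proba]))

# Toy validation labels and predicted positive-class probabilities.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
proba = [0.9, 0.4, 0.35, 0.6, 0.2, 0.55, 0.7, 0.1]
threshold = best_threshold(y_true, proba)
```

The default 0.5 cutoff is rarely optimal for F1 on imbalanced data, which is why tuning the threshold on validation data can raise the reported score without changing the model.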

Handling Imbalanced Data

  • Set balance_probabilities=True as a quick heuristic for imbalanced datasets
  • For more control, use eval_metric="balanced_accuracy" with threshold tuning

Note that balance_probabilities does not always help: in some cases it balances predictions at the cost of overall predictive power. Test both settings.
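One common way to rebalance predictions, shown here as an illustration of the idea rather than TabPFN's exact implementation, is to divide each class probability by that class's training prior and renormalize, which boosts minority classes:

```python
def rebalance(proba, class_priors):
    # Divide each class probability by its training prior, then renormalize.
    # Illustrative only; not necessarily what balance_probabilities does internally.
    adjusted = [p / prior for p, prior in zip(proba, class_priors)]
    total = sum(adjusted)
    return [a / total for a in adjusted]

# A prediction that favors the majority class (90% of training data).
proba = [0.6, 0.4]    # [majority, minority]
priors = [0.9, 0.1]
balanced = rebalance(proba, priors)
# After rebalancing, the minority class wins: balanced[1] > balanced[0].
```

This also shows why such a correction can hurt: a confident, correct majority-class prediction can be flipped to the minority class, which is the trade-off the note above warns about.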