-
Type: Bug
-
Resolution: Fixed
-
Affects Version/s: None
-
Component/s: translate5 AI
-
Priority: Critical
The default value used to calculate the max tokens for the LLMs has been increased, so that reasoning models don't run into the limit.
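A minimal sketch of what such a max-token calculation can look like; the function name, the factor-based formula, and the default values are assumptions for illustration, not translate5's actual code:

```python
# Hypothetical sketch: derive a completion-token budget from the prompt size.
# The factor and floor values are assumed defaults, not translate5's real ones.

DEFAULT_MAX_TOKENS_FACTOR = 4   # assumed, raised default multiplier
MIN_MAX_TOKENS = 1024           # assumed lower bound

def calculate_max_tokens(prompt_tokens: int,
                         factor: int = DEFAULT_MAX_TOKENS_FACTOR) -> int:
    """Return a max-token budget proportional to the prompt size.

    Reasoning models spend part of the budget on internal reasoning
    tokens before producing the visible answer, so a too-small default
    truncates their output; a larger factor leaves them headroom.
    """
    return max(prompt_tokens * factor, MIN_MAX_TOKENS)
```

With the larger default factor, a 2000-token prompt yields a budget of 8000 tokens instead of hitting the previous, tighter limit.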