Samples: temperature and top_p #2220
garrytrinder started this conversation in General
Replies: 0 comments
In many samples in this repository, prompts are configured with the following:
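The original snippet is not preserved here; the referenced prompt configurations look roughly like the sketch below (illustrative values and field names only, not copied verbatim from the repository):

```json
{
  "completion": {
    "temperature": 0.9,
    "top_p": 0.0
  }
}
```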
The Azure OpenAI On Your Data sample, for example, uses a high temperature and a low top_p. Is there a specific reason behind this?
https://github.com/microsoft/teams-ai/blob/main/dotnet/samples/08.datasource.azureopenai/Prompts/Chat/config.json