The streaming concept in Azure OpenAI is quite impressive and would likely enhance the user experience in our MS Teams chatbot. At present, the entire reply is sent in a single method call within the Bot Framework:
await dialogContext.Context.SendActivityAsync(result);
How can we use OpenAIClient.GetCompletionsStreamingAsync with the Bot Framework, given that the framework currently only supports posting a complete message at once? Most Azure OpenAI streaming examples consume the stream in a foreach loop that writes to the console, which does not map onto a bot conversation. Is there any existing functionality for this? If not, when can we expect it to become available?
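For context, one pattern that might approximate streaming in a bot is to post the first partial answer as a normal activity and then edit that same message in place with UpdateActivityAsync as further chunks arrive. A rough sketch of that idea is below; note that the deployment name is a placeholder, the streaming call shapes (StreamingResponse<Completions>, CompletionsOptions) are taken from the Azure.AI.OpenAI preview SDK and have changed between preview versions, so this is an assumption-laden sketch rather than a supported implementation:

```csharp
using System.Text;
using System.Threading.Tasks;
using Azure.AI.OpenAI;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Schema;

public static class StreamingReply
{
    // Sketch: accumulate streamed tokens and repeatedly update one Teams message.
    public static async Task StreamReplyAsync(
        ITurnContext turnContext, OpenAIClient client, string prompt)
    {
        // "my-deployment" is a placeholder; signatures vary across SDK previews.
        StreamingResponse<Completions> stream =
            await client.GetCompletionsStreamingAsync(
                new CompletionsOptions("my-deployment", new[] { prompt }));

        var buffer = new StringBuilder();
        ResourceResponse first = null;

        await foreach (Completions chunk in stream)
        {
            foreach (Choice choice in chunk.Choices)
            {
                buffer.Append(choice.Text);
            }

            if (first == null)
            {
                // Post the first partial answer as a normal activity.
                first = await turnContext.SendActivityAsync(
                    MessageFactory.Text(buffer.ToString()));
            }
            else
            {
                // Edit the same message in place as more tokens arrive.
                IActivity update = MessageFactory.Text(buffer.ToString());
                update.Id = first.Id;
                await turnContext.UpdateActivityAsync(update);
            }
        }
    }
}
```

In practice the updates would need to be throttled (for example, one UpdateActivityAsync per few hundred milliseconds or per N tokens), since channels such as Teams rate-limit message edits. It is unclear whether this is the intended approach, hence the question about built-in support.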