
I'm trying to use the `stream=True` parameter as follows.

import openai

completion = openai.Completion.create(
    model="text-davinci-003",
    prompt="Write me a story about dogs.",
    temperature=0.7,
    max_tokens=MAX_TOKENS,
    frequency_penalty=1.0,
    presence_penalty=1.0,
    stream=True,
)

Unfortunately, I don't know what to do from here to return it to my React frontend. Typically I've used standard response objects, setting a status and `serializer.data` as the data. From my reading online it seems I have to use `StreamingHttpResponse`, but I'm not sure how to integrate that with the `completion` iterator, or how to save the outputted data once it is done streaming, since the view ends after returning the iterator to the endpoint. Any help?

rd360

2 Answers

You can use `StreamingHttpResponse`. Note that most API clients, like Postman, won't show the response streaming live, but you can see it in your terminal. To consume the stream from React, you'll have to use the Fetch API and read the response body incrementally.

import openai
from django.http import StreamingHttpResponse, JsonResponse
from rest_framework.decorators import api_view


@api_view(["POST"])
def generate_names(request):
    if request.method == 'POST':
        # Parse the request body and extract the prompt
        prompt = request.data.get('prompt')

        # Set up the OpenAI API client (OPENAI_API_KEY defined elsewhere)
        openai.api_key = OPENAI_API_KEY

        # Define a generator function to stream the response
        def generate_response():
            for chunk in openai.ChatCompletion.create(
                model="gpt-3.5-turbo",
                messages=[{
                    "role": "user",
                    "content": prompt
                }],
                stream=True,
            ):
                content = chunk["choices"][0].get("delta", {}).get("content")
                if content is not None:
                    yield content

        # Return a streaming response to the client
        return StreamingHttpResponse(generate_response(), content_type='text/event-stream')

    # Return a JSON error if the request method is not POST
    return JsonResponse({'error': 'Method not allowed.'}, status=405)
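As for saving the outputted data once streaming finishes, one option is to wrap the generator so it accumulates chunks and saves them when the stream is exhausted. A minimal sketch, independent of any library; `save_completion` is a hypothetical callback standing in for whatever model save or serializer logic you use:

```python
# Wrap any generator of text chunks so the complete text can be saved
# after the client has consumed the whole stream.
# `save_completion` is a hypothetical callback (e.g. a model save).
def stream_and_capture(chunks, save_completion):
    buffer = []
    for chunk in chunks:
        buffer.append(chunk)
        yield chunk
    # This runs only once the stream has been fully consumed.
    save_completion("".join(buffer))
```

In the view above you would return `StreamingHttpResponse(stream_and_capture(generate_response(), save_fn), ...)`. Note the save only happens if the client reads the stream to the end; a mid-stream disconnect skips it.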

To see it in your terminal, run this curl command (`--no-buffer` makes chunks appear as they arrive): `curl --no-buffer http://127.0.0.1:8000/api/v1/askkk --header "Content-Type: application/json" --data '{"prompt": "How do I set up payment invoices?"}'`

Saeed
blockhead
  • This line has too many closing parentheses (can't edit it b/c SO forces you to change >6 chars): `return StreamingHttpResponse(generate_response(), content_type='text/event-stream'))` – Devon May 11 '23 at 02:43
  • How does this allow you to capture the value of `content` when the entire chat is complete? – Devon May 11 '23 at 02:46
  • Hey, great answer @blockhead - would you happen to know how to do it in react, and through an AWS Lambda? – Carlo May 17 '23 at 01:19
  • @Carlo I had one hack for it, but I doubt it's the correct way at all. I have not used Lambda before. – blockhead May 18 '23 at 15:22

The answer ended up being that I was approaching it incorrectly. Use WebSockets, not StreamingHttpResponse!
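For anyone landing here, a rough sketch of the websocket route, assuming Django Channels: the consumer pushes each chunk over the socket, and because the consumer object outlives the stream, it can keep and persist the full text afterwards. The relay loop below is the Channels-independent core; `send` is a stand-in for the consumer's send coroutine:

```python
# A sketch, assuming Django Channels on the server side.
# `chunks`: any iterable of text pieces (e.g. parsed OpenAI deltas);
# `send`: an async callable that pushes one piece to the client, such as
# `lambda t: self.send(text_data=t)` inside an AsyncWebsocketConsumer's
# receive() method.
async def relay_chunks(chunks, send):
    full = []
    for piece in chunks:
        full.append(piece)
        await send(piece)
    # Unlike a plain StreamingHttpResponse generator, the consumer can
    # now persist the complete text and keep the connection open.
    return "".join(full)
```

This also sidesteps the original problem of the view ending as soon as it returns the iterator.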

rd360
  • As it’s currently written, your answer is unclear. Please [edit] to add additional details that will help others understand how this addresses the question asked. You can find more information on how to write good answers [in the help center](/help/how-to-answer). – devpolo Feb 19 '23 at 18:29