
I'm building a ChatGPT website clone, and now I need to implement the streaming completion effect that shows the result word by word. My server is a TypeScript Node.js app that uses the Express.js framework.

Here's the route:

import express, { Request, Response } from 'express';
import cors from 'cors';
import { Configuration, OpenAIApi } from 'openai';

// ...

app.post('/api/admin/testStream', async (req: Request, res: Response) => {
    const { password } = req.body;

    try {
        if (password !== process.env.ADMIN_PASSWORD) {
            res.send({ message: 'Incorrect password' });
            return;
        }
        const completion = await openai.createCompletion({
            model: 'text-davinci-003',
            prompt: 'Say this is a test',
            stream: true,
        }, { responseType: 'stream' });

        // TypeScript flags this line with the error shown below:
        completion.data.on('data', (chunk: any) => {
            console.log(chunk.toString());
        });

        res.send({ message: 'Stream started' });
    } catch (err) {
        console.log(err);
        res.send(err);
    }
});

// ...

Right now, it gives me an error saying

Property 'on' does not exist on type 'CreateCompletionResponse'.ts(2339)

even though I set { responseType: 'stream' }.

How can I solve this problem and send the response chunk-per-chunk to the frontend? (I'm using Socket.IO.)

Alexxino
  • Instead of `completion.data.on('data', ...);`, you might need to do `completion.on('data', ...);`. – uzluisf Apr 29 '23 at 20:38
  • @uzluisf already tried that, it doesn't work either (same error) – Alexxino Apr 29 '23 at 20:40
  • You're right. I looked at the package's NPM page and it says "Streaming completions (stream=true) are not natively supported in this package yet, but a workaround exists if needed." Did you try it, although it seems quite similar to what you've already tried? Link to workaround: https://github.com/openai/openai-node/issues/18#issuecomment-1369996933 – uzluisf Apr 29 '23 at 20:50
  • Glad you got it working :)! – uzluisf Apr 30 '23 at 23:37

1 Answer


Finally solved it, thanks to the help of @uzluisf! Here's what I did:

import express, { Request, Response } from 'express';
import cors from 'cors';
import { Configuration, OpenAIApi } from 'openai';
import { IncomingMessage } from 'http';

// ...

app.post('/api/admin/testStream', async (req: Request, res: Response) => {
    const { password } = req.body;

    try {
        if (password !== process.env.ADMIN_PASSWORD) {
            res.send({ message: 'Incorrect password' });
            return;
        }

        const completion = await openai.createChatCompletion({
            model: 'gpt-3.5-turbo',
            messages: [{ role: 'user', content: 'When was America founded?' }],
            stream: true,
        }, { responseType: 'stream' });

        // With responseType: 'stream', the response body is actually a Node.js stream,
        // but the typings don't reflect that, so cast it to IncomingMessage:
        const stream = completion.data as unknown as IncomingMessage;

        stream.on('data', (chunk: Buffer) => {
            // Each chunk may contain several SSE events separated by a blank line.
            const payloads = chunk.toString().split("\n\n");
            for (const payload of payloads) {
                if (payload.includes('[DONE]')) return; // end-of-stream marker
                if (payload.startsWith("data:")) {
                    try {
                        const data = JSON.parse(payload.replace("data: ", ""));
                        const token: string | undefined = data.choices[0].delta?.content;
                        if (token) {
                            console.log(token);
                        }
                    } catch (error) {
                        console.log(`Error with JSON.parse and ${payload}.\n${error}`);
                    }
                }
            }
        });

        stream.on('end', () => {
            // Give the last chunk a moment to be processed before closing the HTTP response.
            setTimeout(() => {
                console.log('\nStream done');
                res.send({ message: 'Stream done' });
            }, 10);
        });

        stream.on('error', (err: Error) => {
            console.log(err);
            res.send(err);
        });
    } catch (err) {
        console.log(err);
        res.send(err);
    }
});

// ...

For more info, visit https://github.com/openai/openai-node/issues/18

I also managed to send the message chunks to the frontend using Socket.IO events!
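
Roughly, instead of just logging each token, the same 'data' handler can emit it over Socket.IO. Here is a minimal sketch of that part, assuming a Socket.IO Server instance (io) is attached to the same HTTP server; the helper name forwardStreamToSocket and the event names completionChunk / completionDone are just for illustration, not the exact code from my app:

import { Server } from 'socket.io';
import { IncomingMessage } from 'http';

// Assumed to exist elsewhere: const io = new Server(httpServer, { cors: { origin: '*' } });

function forwardStreamToSocket(stream: IncomingMessage, io: Server) {
    stream.on('data', (chunk: Buffer) => {
        const payloads = chunk.toString().split("\n\n");
        for (const payload of payloads) {
            if (payload.includes('[DONE]')) {
                io.emit('completionDone'); // illustrative event name
                return;
            }
            if (payload.startsWith("data:")) {
                try {
                    const data = JSON.parse(payload.replace("data: ", ""));
                    const token: string | undefined = data.choices[0].delta?.content;
                    if (token) {
                        io.emit('completionChunk', token); // illustrative event name
                    }
                } catch (error) {
                    console.log(`Error parsing ${payload}.\n${error}`);
                }
            }
        }
    });
}

On the frontend, a socket.on('completionChunk', ...) listener can append each token to the message as it arrives, which produces the word-by-word effect.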


BTW, if anyone needs to see more of this app, you can check these links:

Alexxino