
I'm using react-boilerplate (with react-router, sagas, express.js) for my React app, and on top of it I've added SSR logic so that once it receives an HTTP request it renders React components to a string based on the URL and sends the HTML string back to the client.

While React rendering is happening on the server side, it also makes fetch requests through sagas to some APIs (up to 5 endpoints, depending on the URL) to get data for the components before it actually renders them to a string.

Everything works great if I make only a few requests to the Node server at the same time, but once I simulate a load of 100+ concurrent requests and it starts processing them, at some point it crashes with no indication of any exception.

What I've noticed while trying to debug the app is that once the Node server starts processing 100+ incoming requests, it sends the API requests all at once but receives no actual responses until it stops stacking up new incoming requests.

The code that's used for rendering on the server side:

async function renderHtmlDocument({ store, renderProps, sagasDone, assets, webpackDllNames }) {
  // 1st render phase - triggers the sagas
  renderAppToString(store, renderProps);

  // send signal to sagas that we're done
  store.dispatch(END);

  // wait for all tasks to finish
  await sagasDone();

  // capture the state after the first render
  const state = store.getState().toJS();

  // prepare style sheet to collect generated css
  const styleSheet = new ServerStyleSheet();

  // 2nd render phase - the sagas triggered in the first phase are resolved by now
  const appMarkup = renderAppToString(store, renderProps, styleSheet);

  // capture the generated css
  const css = styleSheet.getStyleElement();

  const doc = renderToStaticMarkup(
    <HtmlDocument
      appMarkup={appMarkup}
      lang={state.language.locale}
      state={state}
      head={Helmet.rewind()}
      assets={assets}
      css={css}
      webpackDllNames={webpackDllNames}
    />
  );
  return `<!DOCTYPE html>\n${doc}`;
}

// The code that's executed by express.js for each request
function renderAppToStringAtLocation(url, { webpackDllNames = [], assets, lang }, callback) {
  const memHistory = createMemoryHistory(url);
  const store = createStore({}, memHistory);

  syncHistoryWithStore(memHistory, store);

  const routes = createRoutes(store);

  const sagasDone = monitorSagas(store);

  store.dispatch(changeLocale(lang));
  
  match({ routes, location: url }, (error, redirectLocation, renderProps) => {
    if (error) {
      callback({ error });
    } else if (renderProps) {
      renderHtmlDocument({ store, renderProps, sagasDone, assets, webpackDllNames })
        .then((html) => {
          callback({ html });
        })
        .catch((e) => callback({ error: e }));
    } else {
      callback({ error: new Error('Unknown error') });
    }
  });
}

So my assumption is that something goes wrong once it receives too many HTTP requests, which in turn generate even more requests to API endpoints in order to render the React components.

I've noticed that it blocks the event loop for 300 ms after renderAppToString() for every client request, so once there are 100 concurrent requests it blocks it for about 10 seconds. I'm not sure whether that's normal or a bad thing, though.
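For reference, the blocking can be measured with a simple timer around the synchronous render (a minimal sketch; the wrapper below is illustrative and not part of the actual app code):

function renderAppToStringTimed(store, renderProps, styleSheet) {
  // Time the synchronous render; everything logged here is time the event loop was blocked
  const start = process.hrtime();
  const markup = renderAppToString(store, renderProps, styleSheet);
  const [seconds, nanoseconds] = process.hrtime(start);
  console.log(`renderAppToString blocked for ${(seconds * 1e3 + nanoseconds / 1e6).toFixed(1)} ms`);
  return markup;
}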

Is it worth trying to limit simultaneous requests to the Node server?

I couldn't find much information on the topic of SSR + Node crashes, so I'd appreciate any suggestions on where to look to identify the problem, or possible solutions if anyone has experienced a similar issue in the past.

George Borunov
  • Why not use ReactDOM.hydrate in the client's main js file and use handlebars to return the file as the response? I will write an answer below just to be clear. – Abhay Shiro Feb 11 '18 at 13:13
  • Try using the PM2 cluster mode to handle multiple concurrent requests: http://pm2.keymetrics.io/docs/usage/cluster-mode/ – cauchy Feb 14 '18 at 09:06
  • This is the exact same problem we are dealing with right now. Even with PM2 in cluster mode running 7 servers it's the same issue. Did you by any chance manage to fix it? – Hirad Roshandel Feb 26 '20 at 22:59

3 Answers


1. Run Express in a cluster

A single instance of Node.js runs in a single thread. To take advantage of multi-core systems, the user will sometimes want to launch a cluster of Node.js processes to handle the load.

As Node is single-threaded, the problem may also be in a file lower down the stack where you are initialising Express.

There are a number of best practices for running a Node app that are not generally mentioned in React threads.

A simple way to improve performance on a server with multiple cores is to use the built-in Node cluster module:

https://nodejs.org/api/cluster.html

This will start an instance of your app on each core of your server, giving you a significant performance improvement for concurrent requests (if you have a multi-core server).

For more information on Express performance, see https://expressjs.com/en/advanced/best-practice-performance.html
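A minimal sketch of the cluster approach, assuming your Express app is exported from a local ./app module (that path and the port are illustrative):

// cluster-server.js, an illustrative sketch rather than the poster's actual setup
const cluster = require('cluster');
const os = require('os');

if (cluster.isMaster) {
  // Fork one worker per CPU core; each worker runs the same Express/SSR app
  os.cpus().forEach(() => cluster.fork());

  // Replace any worker that dies (e.g. crashes under load)
  cluster.on('exit', (worker, code, signal) => {
    console.log(`Worker ${worker.process.pid} died (${signal || code}), starting a new one`);
    cluster.fork();
  });
} else {
  const app = require('./app'); // assumed to export the configured Express app
  app.listen(3000, () => console.log(`Worker ${process.pid} listening on port 3000`));
}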

You may also want to throttle your incoming connections, since response times drop rapidly once the process starts context switching. This can be done by adding something like NGINX / HAProxy in front of your application.
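If you want to experiment before putting a proxy in front, a rough in-process sketch of the same idea (capping how many renders run at once) could look like the middleware below; the limit and the names are purely illustrative:

// Illustrative Express middleware that sheds load once too many SSR renders are in flight
const MAX_CONCURRENT_RENDERS = 20; // arbitrary; tune against your own measurements
let activeRenders = 0;

function ssrThrottle(req, res, next) {
  if (activeRenders >= MAX_CONCURRENT_RENDERS) {
    // Fail fast instead of letting render work pile up on the event loop
    res.status(503).set('Retry-After', '1').send('Server busy, please retry shortly');
    return;
  }
  activeRenders += 1;
  let released = false;
  const release = () => {
    if (!released) { released = true; activeRenders -= 1; }
  };
  res.on('finish', release); // normal completion
  res.on('close', release);  // client disconnected early
  next();
}

// Usage: app.use(ssrThrottle) before the SSR route handler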

2. Wait for the store to be hydrated before calling render to string

You don't want to render your layout until your store has finished updating, because, as other comments note, rendering blocks the thread.

Below is the example taken from the redux-saga repo, which shows how to run the sagas without rendering the template until they have all resolved:

  // Kick off the root saga; `done` resolves once every forked task has finished
  store.runSaga(rootSaga).done.then(() => {
    console.log('sagas complete')
    // Only render and serialize the store once all data has been fetched
    res.status(200).send(
      layout(
        renderToString(rootComp),
        JSON.stringify(store.getState())
      )
    )
  }).catch((e) => {
    console.log(e.message)
    res.status(500).send(e.message)
  })

https://github.com/redux-saga/redux-saga/blob/master/examples/real-world/server.js

3. Make sure the Node environment is set correctly

Also ensure you are setting NODE_ENV=production when bundling and running your code, as both Express and React optimise for this.
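A small startup check can catch a forgotten flag (this snippet is illustrative, not part of the boilerplate):

// Warn loudly if the production flag is missing; both React and Express
// fall back to much slower development behaviour without it.
if (process.env.NODE_ENV !== 'production') {
  console.warn('NODE_ENV is not "production", SSR will run in slow development mode');
}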

user1095118
  • As far as I know this allows the Node app to process requests in parallel on multi-core systems. So if I have 2 cores, I'll get 2x the performance compared to the current approach. I don't think that's going to resolve the problem entirely, since the current performance is pretty low (up to 100 concurrent requests). – George Borunov Feb 12 '18 at 19:41
  • No it won't, but a 100% performance increase is good either way :). Updating the answer to address other potential issues. – user1095118 Feb 14 '18 at 10:32
  • What is the TTFB for a single request? If it is large, this could indicate a slow endpoint clogging things up. – user1095118 Feb 14 '18 at 10:54

[screenshot: client entry calling ReactDOM.hydrate(...)]

In the above image I am calling ReactDOM.hydrate(...); I can also load my initial, required state and send it down through hydrate.

[screenshot]

I have written a middleware file, and I use it to decide, based on the URL, which file to send in the response.

[screenshot: the middleware file]

Above is my middleware file. I create the HTML string for whichever file was requested based on the URL, then add this HTML string and return it using Express's res.render.

[screenshot: matching the requested URL path against the path-to-file map]

The above image is where I compare the requested URL path against the dictionary of path-to-file associations. Once a match is found, I use ReactDOMServer's renderToString to convert the component into HTML. This HTML can then be sent through the handlebars file using res.render, as discussed above.
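A rough sketch of that middleware approach might look like the following; the routes map, the page components and the 'index' handlebars view are all illustrative assumptions:

// ssrMiddleware.js, an illustrative sketch of the approach described above
import React from 'react';
import ReactDOMServer from 'react-dom/server';

import HomePage from '../client/pages/HomePage';   // assumed page components
import AboutPage from '../client/pages/AboutPage';

// Dictionary of path-to-component associations
const routes = {
  '/': HomePage,
  '/about': AboutPage,
};

export default function ssrMiddleware(req, res, next) {
  const Component = routes[req.path];
  if (!Component) {
    return next(); // not a page we render on the server, let other handlers deal with it
  }

  // Render the matched component to an HTML string
  const markup = ReactDOMServer.renderToString(<Component />);

  // Hand the markup to a handlebars view ('index.hbs') which drops it into the
  // root div that ReactDOM.hydrate() attaches to on the client
  return res.render('index', { markup });
}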

This way I have managed to do SSR on most of my web apps built with the MERN.io stack.

Hope my answer helps; please leave a comment if you'd like to discuss further.

Abhay Shiro

The calls to renderToString() are synchronous, so they block the thread while they are running. It's no surprise, then, that with 100+ concurrent requests you end up with an extremely backed-up queue hanging for ~10 seconds.

Edit: It was pointed out that React v16 natively supports streaming, but you need to use the renderToNodeStream() method to stream the HTML to the client. It produces the exact same markup as renderToString(), but streams it instead, so you don't have to wait for the full HTML to be rendered before you start sending data to the client.
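A minimal sketch of what that can look like in an Express handler (the <App /> component and the HTML shell are placeholders, not the code from the question):

import express from 'express';
import React from 'react';
import { renderToNodeStream } from 'react-dom/server';
import App from './App'; // placeholder root component

const app = express();

app.get('*', (req, res) => {
  // Send the start of the document immediately
  res.write('<!DOCTYPE html><html><head><title>SSR</title></head><body><div id="root">');

  // Stream the React markup chunk by chunk instead of building one big string
  const stream = renderToNodeStream(<App />);
  stream.pipe(res, { end: false });

  // Close out the document once the component tree has finished rendering
  stream.on('end', () => {
    res.write('</div></body></html>');
    res.end();
  });
});

app.listen(3000);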

mootrichard
  • Since React v16 `renderToString` supports streaming, no need to use external libs. https://reactjs.org/blog/2017/09/26/react-v16.0.html#better-server-side-rendering – zarcode Feb 10 '18 at 08:47
  • Thank you for the update, I'll amend the answer. It actually looks like that method _doesn't_ support streaming; rather, `renderToNodeStream()` is the necessary method for streaming the HTML to the client. – mootrichard Feb 12 '18 at 17:25
  • How do I ensure that all requests have returned some data through sagas before I render the HTML? Even if I make the first renderToString asynchronous, it's still going to wait until all sagas have resolved and then do renderToString a second time to populate it with the data in the store. – George Borunov Feb 12 '18 at 19:35
  • You need to make sure that your sagas don't return until their requests have been resolved. You're already using `await` on `sagasDone()` to ensure the markup isn't produced until that function resolves, so you just need to be sure that the sagas _also_ don't resolve until their requests have completed. Remember that it's not the HTTP requests (which are likely asynchronous anyway); it's the call to `renderToStaticMarkup()` that is causing the process to block and hang while waiting for the requests to resolve. – mootrichard Feb 12 '18 at 21:18
  • As a note, renderToNodeStream does not work with React Helmet, which is referenced in the OP. – user1095118 Feb 14 '18 at 10:56