13

I have a React-based SPA hosted on S3 at one subdomain, react.mydomain.com. It communicates with a PHP REST API hosted on a VPS at another subdomain, api.mydomain.com. api.mydomain.com is behind CloudFlare, and the web app is behind CloudFront since it is on AWS.

I am having issues with bot requests flooding my VPS directly through the API, and I would like to use CloudFlare's JS challenge functionality to mitigate this.

However, what seems to be happening is that users load the React web app (which is not behind CloudFlare), and then the first API request that would prompt the JS challenge instantly fails with a 503 response, because it is an AJAX request and AJAX requests are incompatible with the JavaScript challenge.

I thought I might be able to handle this by catching the error and redirecting. If I manually force my own browser to navigate to the api.mydomain.com URL, I see the CloudFlare challenge and can pass it. However, if I then navigate back to my react.mydomain.com SPA, the OPTIONS requests still fail, because the browser cannot attach the cookie that tells CloudFlare the challenge has been passed.

I don't understand how to adjust my infrastructure so that I can take advantage of the JS challenge. At the moment I am restricted to rate limiting, but I have found that I am still letting through what seems like ~75% or more of the unwanted bot traffic by the time the limits are severe enough that users start complaining.

  • I don't know about CloudFlare specifically, but I've seen other solutions configure the bot-protection tracking cookie to cover the scope of both the frontend and API domains. And to address the main problem, you need to have your JS code detect the challenge, execute it (e.g. with `eval()`), wait for the challenge to finish, and rerun the request. – root Mar 04 '21 at 23:07
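
A minimal sketch of that retry idea, with two loudly flagged assumptions: that the clearance cookie ends up scoped to the parent .mydomain.com domain (not confirmed here), and that the API permits credentialed CORS requests. Rather than `eval()`-ing the challenge (which a cross-origin AJAX response generally will not permit), this version falls back to a full-page navigation so the browser can run the challenge itself; `/challenge-return` is a hypothetical endpoint, not a CloudFlare feature:

```javascript
// Hypothetical fetch wrapper for the SPA: send requests with credentials
// so a parent-domain clearance cookie (an assumption) gets attached, and
// fall back to navigating the whole page when the challenge returns a 503.
async function apiFetch(path, options = {}) {
  const response = await fetch(`https://api.mydomain.com${path}`, {
    ...options,
    credentials: 'include', // required for cross-subdomain cookies
  });

  if (response.status === 503) {
    // A top-level navigation lets the browser execute the challenge page
    // normally; '/challenge-return' would redirect back once cleared.
    window.location.assign(
      'https://api.mydomain.com/challenge-return?to=' +
        encodeURIComponent(window.location.href)
    );
    return new Promise(() => {}); // never settles; the page is navigating away
  }
  return response;
}
```

Usage would look like `apiFetch('/users').then((r) => r.json())`; for the cookie to flow at all, the API must also respond with `Access-Control-Allow-Credentials: true` and an exact `Access-Control-Allow-Origin`.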

3 Answers

-1

If you have backend access, you may be able to use Node.js and call process.kill(process.pid) upon detection of a bot, as a temporary solution.
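
For what it's worth, a minimal sketch of that (admittedly drastic) idea, assuming a Node.js/Express backend; the user-agent regex is a placeholder heuristic, not a real detection method:

```javascript
// Express middleware that kills the whole process when a naive bot
// heuristic matches. Dropping the process closes every open connection
// abruptly, which produces the "closed the connection" browser error
// mentioned in the comment below, for legitimate users too.
const express = require('express');
const app = express();

app.use((req, res, next) => {
  const ua = req.get('User-Agent') || '';
  if (/curl|python-requests|headless/i.test(ua)) {
    process.kill(process.pid); // terminate immediately
  }
  next();
});

app.get('/', (req, res) => res.send('ok'));
app.listen(3000);
```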

9pfs
  • It (coincidentally) creates a "[website] closed the connection" error, which is extremely hard for a _human_ to bypass, and therefore would block headless bots efficiently. – 9pfs Mar 11 '21 at 03:34
-1

It seems like you're facing a complex issue with your current infrastructure setup. Let's break it down and discuss possible solutions.

  1. AJAX requests failing with a 503 response: The JavaScript challenge functionality in CloudFlare is incompatible with AJAX requests, resulting in a 503 response. To overcome this, you can modify your infrastructure to handle the challenge differently for the API requests.

    One option is to create a separate subdomain, such as protectedapi.mydomain.com, and configure it to use CloudFlare with the JavaScript challenge enabled. This subdomain would act as a proxy for your API requests. Your React web app can then send requests to protectedapi.mydomain.com, which will handle the JavaScript challenge and forward the request to api.mydomain.com. This way, the API requests go through the challenge while keeping your web app unaffected. A rough sketch of such a proxy appears after this list.

  2. OPTIONS requests failing after passing the challenge: After successfully passing the JavaScript challenge on api.mydomain.com, your subsequent OPTIONS requests fail because they don't include the necessary cookie indicating the successful challenge.

    To resolve this, you can configure CloudFlare to issue a session cookie when the JavaScript challenge is passed. By enabling the "Cookie" option in the CloudFlare dashboard under "Firewall > Settings > Security Level," a cookie will be issued on successful completion of the JavaScript challenge. This cookie will be automatically attached to subsequent requests, allowing OPTIONS requests to pass through successfully.

    Additionally, ensure that your CORS (Cross-Origin Resource Sharing) configuration allows the necessary headers and cookies to be included in the OPTIONS requests. A sketch of these headers also appears after this list.

  3. Rate limiting and unwanted bot traffic: While rate limiting can help mitigate unwanted bot traffic, it seems that it's not providing sufficient protection in your case. Here are a few additional measures you can consider:

    • Implement a combination of rate limiting and additional security mechanisms like CAPTCHA challenges or user-agent analysis to distinguish between bots and legitimate users more accurately.
    • Utilize CloudFlare's WAF (Web Application Firewall) to set up specific rules and filters to detect and block malicious traffic.
    • Evaluate your rate limiting strategy and adjust the thresholds or patterns based on the characteristics of the unwanted bot traffic you're experiencing.
    • Consider implementing a more sophisticated bot detection system using machine learning techniques or third-party services specializing in bot mitigation.
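
To make point 1 concrete, here is a rough sketch of the proxy idea in plain Node.js, where protectedapi.mydomain.com is assumed to resolve to this process (the subdomain name comes from the answer; everything else is a sketch, not a tested setup):

```javascript
// Minimal reverse proxy: receive requests on protectedapi.mydomain.com
// (fronted by CloudFlare with the JS challenge enabled) and forward them
// to the real API at api.mydomain.com.
const http = require('http');
const https = require('https');

http.createServer((req, res) => {
  const upstream = https.request(
    {
      hostname: 'api.mydomain.com',
      path: req.url,
      method: req.method,
      headers: { ...req.headers, host: 'api.mydomain.com' },
    },
    (upstreamRes) => {
      res.writeHead(upstreamRes.statusCode, upstreamRes.headers);
      upstreamRes.pipe(res);
    }
  );
  req.pipe(upstream);
}).listen(8080);
```

And for point 2, a sketch of the CORS headers the API would need so the browser is willing to attach cookies to cross-subdomain requests, shown as Express middleware for brevity (the equivalent header() calls apply to a PHP API). Note that this only helps once the challenge layer itself lets the OPTIONS preflight through:

```javascript
const express = require('express');
const app = express();

app.use((req, res, next) => {
  // With credentials, the origin must be echoed exactly; '*' is rejected.
  res.set('Access-Control-Allow-Origin', 'https://react.mydomain.com');
  res.set('Access-Control-Allow-Credentials', 'true');
  res.set('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE, OPTIONS');
  res.set('Access-Control-Allow-Headers', 'Content-Type, Authorization');
  if (req.method === 'OPTIONS') return res.sendStatus(204); // answer preflight
  next();
});

app.listen(3000);
```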

Remember to monitor your system and adjust the security measures as needed to strike a balance between preventing unwanted bot traffic and maintaining a smooth user experience.

MilanDulals
  • Yet another copy-paste from ChatGPT – DavidW Jun 21 '23 at 21:47
  • This answer looks like it was generated by an AI (like ChatGPT), not by an actual human being. You should be aware that [posting AI-generated output is officially **BANNED** on Stack Overflow](https://meta.stackoverflow.com/q/421831). If this answer was indeed generated by an AI, then I strongly suggest you delete it before you get yourself into even bigger trouble: **WE TAKE PLAGIARISM SERIOUSLY HERE.** Please read: [Why posting GPT and ChatGPT generated answers is not currently acceptable](https://stackoverflow.com/help/gpt-policy). – tchrist Jul 03 '23 at 21:37
-2

I suggest not hosting the SPA in S3; use S3 only for file uploads or attachments.

Host it on EC2 instead and block all access through a security group policy, allowing only CloudFlare's IPs, which are listed here: https://www.cloudflare.com/ips/
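
As a sketch of that lockdown using the AWS SDK for JavaScript v3: the security-group ID is a placeholder and only two of CloudFlare's published ranges are shown, so pull the full, current list from the URL above in practice.

```javascript
// Allow HTTPS ingress only from CloudFlare ranges on an EC2 security group.
const {
  EC2Client,
  AuthorizeSecurityGroupIngressCommand,
} = require('@aws-sdk/client-ec2');

const ec2 = new EC2Client({ region: 'us-east-1' }); // region is an assumption

async function allowCloudflareOnly(groupId, cidrs) {
  await ec2.send(
    new AuthorizeSecurityGroupIngressCommand({
      GroupId: groupId,
      IpPermissions: [
        {
          IpProtocol: 'tcp',
          FromPort: 443,
          ToPort: 443,
          IpRanges: cidrs.map((cidr) => ({ CidrIp: cidr })),
        },
      ],
    })
  );
}

// 'sg-...' is a placeholder; the CIDRs are examples from cloudflare.com/ips.
allowCloudflareOnly('sg-0123456789abcdef0', [
  '173.245.48.0/20',
  '103.21.244.0/22',
]).catch(console.error);
```

Since a security group denies ingress by default, authorizing only these ranges blocks everything else, provided no other ingress rules exist.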

You can also use AWS Lambda (serverless) for hosting instead of S3: https://aws.amazon.com/lambda/?c=ser&sec=srv

  • S3, especially with CloudFront in front of it, is a perfectly suitable and scalable solution to host a single-page application. Hosting on EC2 or using a Lambda is not enough. A single-page application on EC2 will still have to be served with either a Node.js service or an Apache server, for example. A Lambda will have to be integrated with API Gateway in order to serve the content. Both of these solutions seem clunky for a SPA, whereas S3 offers a straightforward solution. It looks like the problem the user is encountering is not related to the AWS service used for the app. – Yves Gurcan Apr 28 '21 at 03:32