I have a website where users can create new questions. Whenever a new question is created, a new URL is generated. I want Google to crawl my website every time a new question is added and show it in search results.

My front end is built with React and my backend with Express.

My front end is hosted on Firebase and my backend on Heroku.

Since I am using JavaScript and my URLs are all dynamically generated, Google does not crawl or index them.

Currently I am writing all dynamically created URLs into a file called sitemap.txt in the root folder of my backend.

What should I do to achieve this?

My sitemap link: https://ask-over.herokuapp.com/sitemap.txt

My React app's link: https://wixten.com

My Express link: https://ask-over.herokuapp.com

I want to add https://ask-over.herokuapp.com/sitemap.txt to Google Search Console.
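Since the backend already knows every question URL, the sitemap.txt can be built on the fly instead of written to a file. A minimal sketch, assuming a `questionSlugs` list and the `https://wixten.com/question/<slug>` URL pattern (both illustrative, not from the original post):

```javascript
// Sketch: build a plain-text sitemap (one absolute URL per line) from a
// list of question slugs. `questionSlugs` and the URL pattern are
// assumptions for illustration; in practice the slugs would come from
// your database.
const questionSlugs = ["how-to-deploy-react", "express-cors-error"];

function buildSitemap(slugs) {
  return slugs
    .map((slug) => `https://wixten.com/question/${slug}`)
    .join("\n");
}

// In Express, serve it with the text/plain content type Google expects:
// app.get("/sitemap.txt", (req, res) => {
//   res.type("text/plain").send(buildSitemap(questionSlugs));
// });
```

Serving the sitemap from a route rather than a static file means newly created questions appear in it immediately, with no file writes on Heroku's ephemeral filesystem.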

vivek kn

3 Answers

In fact, create-react-app is the wrong tool when SEO matters, because:

  • there is only one HTML file
  • there is no content inside that single HTML file
  • the first load is heavy
  • etc. (search for articles on why Next.js helps with SEO)

SPAs are best suited for PWAs, admin panels, and apps like that.

But take a look at https://nextjs.org/docs/migrating/from-create-react-app. My suggestion is to make a plan to fully migrate to Next.js. Also, look into React SEO best practices and use helpers and utilities like React Helmet.
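As a rough sketch of what React Helmet usage might look like for a question page (the `question` prop shape and component name are assumptions, and note that Helmet only helps crawlers that execute JavaScript unless it is combined with SSR or prerendering):

```jsx
// Sketch: per-question <title> and meta tags with react-helmet.
// The `question` object shape is an assumption for illustration.
import React from "react";
import { Helmet } from "react-helmet";

function QuestionPage({ question }) {
  return (
    <>
      <Helmet>
        <title>{question.title} – Wixten</title>
        <meta name="description" content={question.summary} />
        <link
          rel="canonical"
          href={`https://wixten.com/question/${question.slug}`}
        />
      </Helmet>
      {/* question body rendered here */}
    </>
  );
}

export default QuestionPage;
```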

Mohammad

create-react-app is not the way to go if you want an SEO-friendly website. If the app sits behind a login screen, create-react-app is fine. If the site is a blog or documentation site, I would suggest migrating to Next.js or Gatsby, or, if it's a very small page, going with raw HTML, CSS, and JS.

fahad991

Google and other web crawlers cannot reliably crawl SPA websites. The best way to fix this is either to use a server-side framework like Next.js, or to use pre-rendering and redirect crawlers to the pre-rendering server instead of the main website.

You can check out prerender.io; it has an open-source version as well. You can run it on a separate server and use one of the snippets/plugins for your web server (Apache/Nginx/others) to redirect crawler requests to that upstream server.
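A redirect rule of that kind might look roughly like this in Nginx (a sketch only: the user-agent list is incomplete, and the upstream hosts are placeholders for the hosted prerender service and your own app):

```nginx
# Sketch: route crawler requests to a prerender service, everything else
# to the app. Hostnames and the user-agent list are illustrative.
location / {
    set $prerender 0;
    if ($http_user_agent ~* "googlebot|bingbot|yandex|baiduspider") {
        set $prerender 1;
    }
    if ($prerender = 1) {
        # Rewrite so the prerender service receives the full page URL.
        rewrite .* /https://$host$request_uri? break;
        proxy_pass https://service.prerender.io;
    }
    if ($prerender = 0) {
        # Your app server; port is an assumption.
        proxy_pass http://127.0.0.1:3000;
    }
}
```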

I've been using it for one of my projects (an e-commerce store) built on Vue.js, and it works like a charm.

To understand the basics: it loads your website in a browser and caches the rendered markup in its database/cache. When a crawler visits your website, the request is redirected to that cache, i.e. the generated HTML page of your site, so crawlers can read everything smoothly.
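The "redirect crawlers" part boils down to user-agent sniffing. A minimal sketch in Express terms (the regex list is illustrative and not exhaustive, and the prerender URL is an assumption):

```javascript
// Sketch: detect common crawler user-agents so their requests can be
// routed to a prerender server. The pattern list is illustrative only.
const CRAWLER_RE =
  /googlebot|bingbot|yandex|baiduspider|duckduckbot|twitterbot|facebookexternalhit/i;

function isCrawler(userAgent) {
  return CRAWLER_RE.test(userAgent || "");
}

// Express middleware sketch (service URL is an assumption):
// app.use((req, res, next) => {
//   if (isCrawler(req.get("user-agent"))) {
//     return res.redirect(
//       302,
//       `https://service.prerender.io/https://wixten.com${req.originalUrl}`
//     );
//   }
//   next();
// });
```

In production the official prerender middleware packages do this check for you; the sketch just shows the idea.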

thisisayush