
I am working on my first full-stack JavaScript application, using MEAN.js specifically as my starting point, and I have started to become nervous and somewhat confused about the issue of search engine optimization (SEO).

Have Google's recent efforts (within the last year or so) to improve JavaScript crawling rendered this a non-issue, or is it something I need to take into account when planning and structuring my project?

If Google can crawl AngularJS/Ajax-heavy applications now, why are we still seeing blog posts about solutions to the SEO issue, like this one: http://blog.meanjs.org/post/78474995741/mean-seo

  • Is this type of solution necessary?
  • Will it be as effective as server-side rendering in terms of SEO?
  • Are hashbang (#!) URLs a necessary evil or just plain evil?

I know questions about SEO and AngularJS have been asked before, but there seem to be so many differing opinions on the issue that I am lost, and it would be nice to have some thoughts that are more MEAN.js specific. My major worries are:

  • Whether building an AngularJS-heavy implementation will make my site an SEO black hole.
  • Whether I will end up rebuilding practically the whole project as static files just for SEO.
  • Whether I need to be looking at a server-side rendering solution.
Finglish

3 Answers


If you are rendering the majority of your content using JavaScript, then yes, your site becomes a search-engine black hole. That's one of the big downsides of a thick-client application. If you need high visibility in search engines, it's a challenge. There is a middle ground, though.

You'll need a combination of server-side rendering and client-side rendering. When the page first loads, it should contain all the visible content the user needs, or at least the content that appears "above the fold" (at the top of the page). Links should be descriptive and allow search engines to dive deeper into the site. Your site's main menu should be delivered with the web page as well, giving search engines something to bite into.

Content below the fold, or paginated content, can be pulled in dynamically and rendered on the client using any JavaScript framework. This gives you a good mix of server-side rendering to feed search engines and the performance boost that pulling content in dynamically can offer.
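To make the split concrete, here is a rough sketch of the hybrid approach in Express. The routes and the Article/Comment Mongoose models are assumptions for illustration, not MEAN.js conventions:

    // A rough sketch (Express + Mongoose) of the hybrid approach above.
    // 'Article' and 'Comment' are hypothetical models assumed to be
    // registered elsewhere; the routes are illustrative only.
    var express = require('express');
    var mongoose = require('mongoose');
    var router = express.Router();
    var Article = mongoose.model('Article');
    var Comment = mongoose.model('Comment');

    // First request: render the above-the-fold content and the main menu
    // on the server, so crawlers receive real HTML, not an empty shell.
    router.get('/articles/:slug', function (req, res, next) {
      Article.findOne({ slug: req.params.slug }, function (err, article) {
        if (err) { return next(err); }
        res.render('article', { article: article }); // server-side template
      });
    });

    // Below-the-fold and paginated content: fetched later by the client
    // over Ajax and rendered with AngularJS.
    router.get('/api/articles/:slug/comments', function (req, res, next) {
      var page = Number(req.query.page) || 0;
      Comment.find({ article: req.params.slug })
        .skip(page * 20)
        .limit(20)
        .exec(function (err, comments) {
          if (err) { return next(err); }
          res.json(comments);
        });
    });

    module.exports = router;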

Greg Burghardt
    More specifically, render all SEO and feed relevant content into the page with ie. php. As for hashbanging, I'd go with history.pushstate instead. – ostrgard Nov 26 '14 at 13:48
  • Thanks. As I am still early enough in development to do so, I decided to change things around to use more server-side rendering for the initially visible content, as you suggested. I am still working on the details, but initial tests look good. I accepted this answer, but also upvoted other answers, as there were several helpful tips in them. – Finglish Dec 01 '14 at 23:42
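For reference, moving from hashbang URLs to history.pushState in AngularJS 1.x means enabling HTML5 mode. A minimal sketch, where the module name myApp is an assumption for illustration:

    // Minimal sketch: replace #! routing with HTML5 history (pushState)
    // URLs in AngularJS 1.x. The module name 'myApp' is an assumption.
    // This also requires a <base href="/"> tag in index.html and a server
    // catch-all route so that deep links still return the app shell.
    angular.module('myApp', ['ngRoute'])
      .config(['$locationProvider', function ($locationProvider) {
        $locationProvider.html5Mode(true);
      }]);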

Well, you only need to worry about the public face of your site; you shouldn't be concerned about anything behind a login screen. To me, the snapshot-with-a-headless-browser approach using the _escaped_fragment_ scheme seems the way to go: it's the one that will consume the least time, and as you saw with mean-seo, it isn't that hard to implement.
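The mechanics of that scheme, roughly: a crawler that encounters a #! URL re-requests the page with a ?_escaped_fragment_= query parameter, and the server answers with a pre-rendered HTML snapshot. A minimal Express sketch (not the mean-seo internals), where renderWithPhantom is a hypothetical helper that loads the URL in a headless browser and returns the resulting HTML:

    // Minimal sketch of the _escaped_fragment_ snapshot protocol.
    // renderWithPhantom is a hypothetical headless-browser helper.
    var express = require('express');
    var app = express();

    app.use(function (req, res, next) {
      var fragment = req.query._escaped_fragment_;
      if (fragment === undefined) {
        return next(); // Regular visitor: serve the normal Angular app.
      }
      // Crawler: rebuild the original #! URL and serve a static snapshot.
      var url = req.protocol + '://' + req.get('host') +
                req.path + '#!' + fragment;
      renderWithPhantom(url, function (err, html) {
        if (err) { return next(err); }
        res.send(html); // Pre-rendered HTML the crawler can index.
      });
    });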

Take a look at this Google support page; it explains how to make your pages and links crawlable and SEO friendly, and almost all of the recent posts on the subject are consistent with it:

https://support.google.com/webmasters/answer/174992?hl=en

Also try registering at https://webmasters.stackexchange.com/ where you'll find much more about SEO.

pedrommuller

Just wanted to mention the npm package mean-seo (https://www.npmjs.com/package/mean-seo), which uses PhantomJS to render a snapshot of your app and caches it on disk or in Redis for whatever period you set.
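Mounting it is a single piece of Express middleware. A sketch, with the caveat that the option names below (cacheClient, cacheDuration) are recalled from the package README and should be treated as assumptions to verify against the installed version:

    // Sketch of mounting mean-seo as Express middleware. The options shown
    // are assumptions based on the README; verify before relying on them.
    var seo = require('mean-seo');

    app.use(seo({
      cacheClient: 'disk',                // or 'redis'
      cacheDuration: 24 * 60 * 60 * 1000  // cache lifetime in milliseconds
    }));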

David Avikasis