
I recently completed a project in which I moved a fairly large website over to a clean, smaller one. This meant I had to account for a large number of pages that had been indexed by search engines and redirect each of them to a relevant page on the new website using 301 redirects.

There were over 300 unique URLs, all of which I had to redirect. I used the Apache mod_rewrite module to achieve the redirects, as most of the URLs contained GET parameters, which a standard Redirect directive doesn't match against.

So, for example, I had URLs like this:

https://olddomain/product.html?prod_id=143&id=234

and a redirect like so:

# match requests whose query string is exactly prod_id=143&id=234
RewriteCond %{QUERY_STRING} ^prod_id=143&id=234$
# the trailing "?" on the substitution strips the old query string
RewriteRule ^product\.html$ /products/product-a? [L,R=301]

This was the case for pretty much all of them. I guess my question is: was I correct to write out 300 individual redirects like the one above, or is there a "cleaner" way to do this? I've heard that having too many rewrite rules can slow a server down.
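
One alternative I've come across but haven't tried is mod_rewrite's RewriteMap directive, which would move the whole lookup table into a single file. A rough, untested sketch, assuming a map file at /etc/apache2/redirects.map (RewriteMap has to live in the server or virtual host config, not .htaccess):

RewriteMap redirects txt:/etc/apache2/redirects.map
RewriteCond ${redirects:%{QUERY_STRING}} !=""
# in server/vhost context the pattern includes the leading slash
RewriteRule ^/product\.html$ ${redirects:%{QUERY_STRING}}? [L,R=301]

with one line per old URL in the map file, e.g. (second entry hypothetical):

prod_id=143&id=234 /products/product-a
prod_id=144&id=235 /products/product-b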

I'd appreciate your input on this, guys. If this isn't the right place to ask, apologies; I'll delete this and post it on the relevant exchange.

Javacadabra

1 Answer

I would implement this kind of redirect in the application code itself: create an array mapping source query strings to destination URLs, and when a request's parameters match an entry in the array, send back a 301 redirect to the destination.

Using rewrite rules for this is a rather complicated way to implement it.
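
For illustration, here is a minimal sketch of that lookup as a Python WSGI app. The question doesn't say what language the site is written in, so the language choice and the second table entry are assumptions:

REDIRECTS = {
    "prod_id=143&id=234": "/products/product-a",
    "prod_id=144&id=235": "/products/product-b",  # hypothetical entry
}

def application(environ, start_response):
    # look up the raw query string; an exact-match lookup keeps it simple
    destination = REDIRECTS.get(environ.get("QUERY_STRING", ""))
    if destination:
        # permanent redirect, dropping the old query string
        start_response("301 Moved Permanently", [("Location", destination)])
        return [b""]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not found"]

A real implementation would presumably key on the request path as well as the query string, and hold all 300 entries.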

Tero Kilkanen
  • Thank you for your answer. Are there any other disadvantages to using rewrite rules bar the complexity? I'm concerned about performance issues with lots of rewrite rules. – Javacadabra Jun 21 '16 at 11:52
  • I do not know exactly how these rewrite rules are implemented inside Apache. Depending on the implementation, there can be performance issues. However, with the processing capacity of current servers, I don't think that is going to be relevant. – Tero Kilkanen Jun 21 '16 at 12:18