I recently completed a project in which I moved a fairly large website over to a clean, much smaller one. This meant accounting for a large number of pages that had been indexed by search engines and redirecting each of them to a relevant page on the new site using 301 redirects.
There were over 300 unique URLs, all of which I had to redirect. I used Apache's mod_rewrite module to achieve this, as most of the URLs contained GET parameters, which a standard Redirect statement doesn't match against.
So, for example, I had URLs like this:
https://olddomain/product.html?prod_id=143&id=234
and a redirect like so:
RewriteCond %{QUERY_STRING} prod_id=143&id=234$
RewriteRule ^product\.html$ /products/product-a? [L,R=301]
This was the case for pretty much all of them. I guess my question is: was I right to write out 300 individual redirects like the above, or is there a "cleaner" way to do this? I've heard that having too many rewrite rules can slow a server down.
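For what it's worth, the "cleaner" alternative I've seen mentioned is Apache's RewriteMap, though I haven't tried it myself. As I understand it, you'd put all 300 mappings in a plain-text lookup file and use a single rule pair, something like this (the map name and file path here are just made up for illustration):

```apache
# Must go in the server/vhost config – RewriteMap can't be
# declared inside .htaccess (though the map can be used there)
RewriteMap redirects txt:/etc/apache2/old-to-new.txt

RewriteEngine On
# Look the old query string up in the map; if a target exists, 301 to it
RewriteCond ${redirects:%{QUERY_STRING}} ^(.+)$
RewriteRule ^product\.html$ %1? [L,R=301]
```

with the lookup file holding one old-query-string/new-path pair per line:

```
# old-to-new.txt
prod_id=143&id=234 /products/product-a
prod_id=144&id=235 /products/product-b
```

I believe the txt map is read once and cached, so it should scale better than 300 individual RewriteCond/RewriteRule pairs, but I'd welcome confirmation on that.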
I'd appreciate your input on this, guys. If this isn't the right place to ask, apologies; I'll delete and post on the relevant exchange.