Given the following URLs:
- example.com/products
- example.com/products#/page-2
- example.com/products#/page-3
- ...
Using the robots.txt file, the first URL (example.com/products) should remain indexable, while every other one should be blocked from being indexed. How can this be done?
None of the following attempts works as desired:
```
Noindex: /products#/page-*
Noindex: /products\#/page-*
Noindex: /*/page-*
Noindex: /*#/page-*
Noindex: /*\#/page-*
```
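For reference, here is how one of these URLs splits into path and fragment, as a minimal sketch using Python's standard `urllib.parse` (the `https://` scheme is an assumption, since the URLs above are listed without one):

```python
from urllib.parse import urlsplit

# One of the paginated URLs from the list above (scheme assumed).
url = "https://example.com/products#/page-2"

parts = urlsplit(url)
print(parts.path)      # '/products' -- the path portion of the URL
print(parts.fragment)  # '/page-2'   -- everything after the '#'
```

This is only meant to make the structure of the URLs explicit; whether a robots.txt pattern can match the fragment part at all is exactly what I am unsure about.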