On my site (e.g. website.com), I want to move all of my .html files (a few hundred) into a single folder to better organize the site; however, I want to serve them from the root, as before.

In other words, a file's new path may be /archive/page.html, but the link for it on the site would still be /page.html. Is this doable?
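On Apache, this can be done with a `.htaccess` rewrite in the site root. A minimal sketch, assuming Apache with mod_rewrite enabled and that the files actually live in a folder named `/archive/` (both are assumptions about the hosting setup):

```apache
# Hypothetical .htaccess placed in the document root.
# Assumes Apache with mod_rewrite enabled and files stored in /archive/.
RewriteEngine On

# Don't rewrite requests that already point into /archive/
RewriteCond %{REQUEST_URI} !^/archive/

# Only rewrite if the requested file really exists inside /archive/
RewriteCond %{DOCUMENT_ROOT}/archive%{REQUEST_URI} -f

# Serve /page.html from /archive/page.html, internally (URL stays the same)
RewriteRule ^(.*)$ /archive/$1 [L]
```

Because this is an internal rewrite (no `[R]` flag), visitors and search engines keep seeing the root-level URLs.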

  • Create a `.htaccess` file; take a look at this: http://stackoverflow.com/questions/19389344/how-can-i-remove-this-directory-folder-from-url-with-htaccess-rewrite – VMcreator Aug 06 '15 at 04:06
  • If you have so many files that you need to organize them into folders, why would you want your links to be *less* organized? – jpaugh Aug 06 '15 at 04:06
  • @jpaugh A bit of a unique situation, all the files are in one folder and I just don't want them to appear inside the folder in the URL. When Google indexes the pages, they'll be indexed following whatever .htaccess rules are implemented right? – Vic Aug 06 '15 at 04:12
  • Okay. That can be done by changing which folder the website is served from (to the `archive` folder, instead of the folder above it). This is usually pretty easy to do, but the answer depends on which web server you're using. Are you using Apache? NginX? Or something else? – jpaugh Aug 06 '15 at 04:13
  • @Vic What OS and web server are you using? –  Aug 06 '15 at 04:14
  • @Vic I tried to clarify your question based on your comments. Please make sure this matches what you meant. Thanks! – jpaugh Aug 06 '15 at 04:21
  • @jpaugh Honestly have no idea since it's a very simple site on a shared hosting plan and I'm just working with it through cPanel/filezilla. Netcraft is showing the site runs on Linux and an unknown web server. Not to mention, web servers are an entirely foreign concept to me tbh. – Vic Aug 06 '15 at 04:22
  • @jpaugh Yeah, that's exactly what I mean, thank you :) Will be trying the .htaccess method in a moment – Vic Aug 06 '15 at 04:24
  • Okay. If you include the name of your web-host in the question, someone might be able to help. But many web-hosts don't give you enough control to configure the web server. – jpaugh Aug 06 '15 at 04:26
  • If you're lucky, they might let you drop a `.htaccess` file in your top level directory, to configure `Apache` (if they use Apache). (GRC's post might help in that case.) – jpaugh Aug 06 '15 at 04:26
  • @Vic To answer your earlier question about Google searches, they'll see whatever links are publicly visible. You can give the Google web crawler special instructions using a [robots.txt](https://en.wikipedia.org/wiki/Robots_exclusion_standard) file, but this is unrelated to your main question. – jpaugh Aug 06 '15 at 04:39
  • @jpaugh Hm, the .htaccess method works fine for me. The only slight flaw is that it broke my links that involved going up a directory (e.g. images), since it altered the URL path — easily remedied with a find-and-replace, although I wish there were a better solution. Should I mark that question as "That solved my problem" then? – Vic Aug 06 '15 at 04:44
  • If it were me, I would update my question to include all of the relevant information, like the fact that you don't want to redirect *every* file (read: not image files). But, I bet that question already has an answer somewhere. – jpaugh Aug 06 '15 at 18:05
  • I think `.htaccess` files can handle this, but I don't know the best way to do it, because I don't use Apache. I'm going to update the tags to include Apache – jpaugh Aug 06 '15 at 18:06
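The broken image links mentioned in the comments can be avoided by limiting the rewrite to top-level `.html` requests, so other assets resolve at their real paths. Another sketch, again assuming Apache/mod_rewrite and a folder named `/archive/`:

```apache
# Variant restricted to .html files only, so images, CSS, and other
# assets are served from their actual locations unchanged.
# (The /archive/ folder name is an assumption from the question.)
RewriteEngine On
RewriteCond %{DOCUMENT_ROOT}/archive/$1 -f
RewriteRule ^([^/]+\.html)$ /archive/$1 [L]
```

With this variant, only requests like `/page.html` are redirected internally to `/archive/page.html`; a request for `/images/logo.png` is untouched, so relative links in the pages keep working without a mass find-and-replace.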

0 Answers