
I have an issue trying to move servers. I have a couple of folders with 300k subdirectories, and wget will only copy the first 10000 (the same number that FileZilla is able to show).

Splitting those folders would be a huge amount of work. Any help would be greatly appreciated.

Callombert
  • Can't you make an archive of it all, or of parts of it? Note that it may be a good idea to reorganize this, as such a structure can put an undue load on the machine in many use cases. – Julie Pelletier Jun 04 '16 at 16:32
  • @JuliePelletier I never realised having that many folders could be an issue. They all have images inside, some of them quite big. Splitting is not impossible, but I'd really like to avoid doing so. – Callombert Jun 04 '16 at 16:37
  • 1
    I'm sure you can realize that looking for a book in a small drawer is much faster than looking for a book in an unsorted library. That is a strain you put on your server for **everything** that is below that directory structure. – Julie Pelletier Jun 04 '16 at 16:46
  • @JuliePelletier I'm not really familiar with how a server goes about finding a folder. They are profile pictures, and each URL is like /profile//profile.jpg. I'll certainly take a look at splitting those, but the problem remains for now. – Callombert Jun 04 '16 at 16:52
  • There is nothing complicated going on, as my analogy shows. If the listing is that long, the computer has to do a lot more work to find whatever you ask it to open. Solving that now would also solve your current issue. The simplest solution is to split the contents based on the first characters of the user id, a few levels deep. Otherwise, as I previously mentioned, make an archive of everything and you'll have a single file to transfer. – Julie Pelletier Jun 04 '16 at 16:57
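Following up on the archive suggestion above: a minimal sketch of packing the tree into one file and moving it. The paths and host (/var/www/profiles, user@newserver) are placeholders, not taken from the question:

```bash
# Pack the whole tree into a single compressed archive.
# /var/www/profiles is a placeholder; substitute your real path.
tar -czf profiles.tar.gz -C /var/www profiles

# Transfer one file instead of 300k directory entries.
scp profiles.tar.gz user@newserver:/tmp/

# Unpack on the destination server.
ssh user@newserver 'tar -xzf /tmp/profiles.tar.gz -C /var/www'
```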
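And a rough sketch of the reorganisation Julie Pelletier describes: sharding the per-user directories by the first two characters of the user id. The base path is again a placeholder; try it on a copy first, and remember the application's URL-to-path mapping has to change to match:

```bash
#!/usr/bin/env bash
# One-off re-shard: profiles/abcd1234 -> profiles/a/b/abcd1234
BASE=/var/www/profiles   # placeholder path

for dir in "$BASE"/*/; do
    name=$(basename "$dir")
    # Skip names too short to shard two levels deep
    # (this also skips the shard directories on a re-run).
    [ "${#name}" -lt 2 ] && continue
    shard="$BASE/${name:0:1}/${name:1:1}"
    mkdir -p "$shard"
    mv "$dir" "$shard/"
done
```

With two one-character levels and alphanumeric ids, 300k entries spread over roughly 36 × 36 = 1296 shards, so each directory holds a few hundred entries instead of 300k, which is what keeps lookups and listings fast.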
