
I want to download a folder and its contents from a given URL. My requirement is to download a project from a repository, but the repository will not necessarily be SVN or Git, so I am planning to copy the folder from the given URL. I cannot depend on an SVN repository; it could be a plain HTTP URL. Please suggest the best option, and whether any free, open-source utilities are available. What about Ant? Does it support downloading a folder from any URL? I know it supports SVN, but what about a normal HTTP URL?

2 Answers


To copy files from a server, you can use the command-line tools `wget` or `curl`.

Using wget: How to mirror only a section of a website?

For cURL, you have to use a script: http://curl.haxx.se/programs/curlmirror.txt
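A hedged sketch of the `wget` approach, assuming the server returns an HTML index page for the folder (without one, `wget -r` has no links to follow). The URL below is a placeholder, so the command is printed rather than run here:

```shell
# Placeholder URL; substitute the real folder you want to copy.
URL="http://example.com/repo/project/"
# -r  recurse through links found in the server's HTML index pages
# -np stay below the starting folder (--no-parent)
# -nH drop the host name from local paths; --cut-dirs=1 trims the
#     leading "repo/" segment so files land directly under ./project/
CMD="wget -r -np -nH --cut-dirs=1 $URL"
echo "$CMD"   # shown rather than executed against the placeholder
```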

Aaron Digulla

With plain HTTP you have the problem that you cannot list the entries in a folder, so you will have trouble recursively downloading a directory from a remote server. That's why you need something that provides an index, but it doesn't need to be SVN or Git or anything like that. An example of how you can use FTP in Java can be found here
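To illustrate the point: whether a folder can be enumerated over plain HTTP depends entirely on the server publishing an index page. The sketch below is hypothetical (the sample HTML is invented, in the style of an Apache autoindex listing) and shows how entries could be scraped from such a page; if the server returns no index, there is simply nothing to iterate over.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class IndexScrape {
    // Extract href targets from an HTML directory listing, skipping
    // sort links ("?C=N;O=D") and the parent-directory link ("/").
    static List<String> listEntries(String indexHtml) {
        List<String> entries = new ArrayList<>();
        Matcher m = Pattern.compile("href=\"([^\"?/][^\"]*)\"").matcher(indexHtml);
        while (m.find()) {
            entries.add(m.group(1));
        }
        return entries;
    }

    public static void main(String[] args) {
        // Invented sample of what a server *might* return for a folder.
        String sample = "<html><body><h1>Index of /project</h1>"
                + "<a href=\"?C=N;O=D\">Name</a>"
                + "<a href=\"/\">Parent Directory</a>"
                + "<a href=\"build.xml\">build.xml</a>"
                + "<a href=\"src/\">src/</a></body></html>";
        System.out.println(listEntries(sample)); // prints [build.xml, src/]
    }
}
```

Each scraped entry would then be fetched individually (and entries ending in `/` descended into), which is essentially what `wget -r` does under the hood.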

kutschkem