
My employer has been misusing Bintray as our binary repository for some time. We are finally moving to Artifactory instead and closing down Bintray, but this is turning out to be an almost impossible task. There is no way of exporting Bintray repos to a zip; downloading them means manually fetching each file from the UI or through their API. I have tried two approaches to automate this:

1) Crawling our Bintray with wget, like this: wget -e robots=off -o ~/wget.log -w 1 -m -np --user --password "https://.bintray.com", which yielded all of the files in the repos. But this only solves half the problem: I couldn't find out how to get the files into a repository in Artifactory (the repos are all over 100 MB each and, for some reason, can't be uploaded).
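In principle the per-file deploys could presumably be scripted against Artifactory's REST API (a plain HTTP PUT per file) rather than the UI; a rough sketch only, where the server URL, repository key and credentials are placeholders:

    # Rough sketch: walk the wget mirror and deploy each file to a local repository
    # via Artifactory's REST deploy (HTTP PUT). URL, repo key and credentials are placeholders.
    ART="https://myserver/artifactory"
    REPO="bintray-import-local"
    cd ~/mirror/bintray-download          # root directory produced by the wget mirror
    find . -type f | while read -r f; do
        curl -u "$USER:$PASS" -T "$f" "$ART/$REPO/${f#./}"
    done

Whether this gets around the 100 MB limit presumably depends on where that limit is enforced.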

2) I set the Bintray repos up as remote repositories in Artifactory and enabled Active Replication. That seems to have worked for now, but I don't know whether the content will be removed when the Bintray account goes away, or even whether it is actually stored in Artifactory. I would therefore like to convert the remote repos into local repos, to make sure the content is permanently stored in Artifactory. Is there a way of doing this? If so, how?
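As a sanity check, I assume the replicated content can at least be inspected by listing the remote repository's cache (the repository key with a -cache suffix) through the storage REST API, something like:

    # List everything Artifactory has cached for a remote repository named "bintray-remote"
    # (placeholder name); the "-cache" suffix addresses the local cache behind the remote repo.
    curl -u "$USER:$PASS" \
        "https://myserver/artifactory/api/storage/bintray-remote-cache/?list&deep=1"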

kschnack

1 Answer


I'll try to address both of your questions below.

  1. What do you mean you can't upload more than 100 MB? Which version of Artifactory are you using? Is it an on-prem or SaaS installation? How are you trying to upload your files to Artifactory? Have you tried importing the content with Artifactory's import feature (Admin --> Import & Export --> Repository Import)? It sounds like you are using the UI for the upload; if so, you can configure the maximum upload size on the Admin --> General Configuration page.
  2. If you mean that you have all of the content from Bintray cached in your remote repository cache in Artifactory, just use the "Copy" or "Move" option and move the content to a local repository. This will ensure that all of the content is stored locally.
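The same copy can also be driven through the REST API instead of the UI; a minimal sketch, assuming a remote repository named bintray-remote and a target local repository named bintray-local (the repo keys, server URL and credentials are placeholders):

    # Copy the entire remote-repository cache into a local repository (placeholder repo keys).
    # Append &dry=1 first to see what would be copied without changing anything.
    curl -u "$USER:$PASS" -X POST \
        "https://myserver/artifactory/api/copy/bintray-remote-cache/?to=/bintray-local/"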
Ariel
  • Just to clarify, the copy/move operations are cheap, as no actual content is transferred, only pointers in the database. To copy/move the content, just browse to the tree browser in the Artifactory UI and right-click on the remote-repository-cache --> Copy/Move. – Ariel Jul 02 '17 at 06:51
  • Does this mean that when you have a remote repo and you migrate it to a local repo, it will only migrate the content that is in the remote repo's cache? (That seems normal to me, but I'm not sure.) – lvthillo Sep 18 '17 at 09:25
  • Exactly. Artifactory can't "move" content that it doesn't have, so it will move only the cached artifacts. – Ariel Sep 18 '17 at 13:05
  • Thanks. We have an Artifactory OSS and an Artifactory Pro instance, and we need to migrate a local repo from the OSS one. We can't push the local repo to a local repo on our Pro Artifactory because OSS doesn't have that feature; we can only pull (on Pro), and that is only possible for remote repositories. So wouldn't it be possible to use that approach and afterwards move the remote repository on the Pro Artifactory to a local repository? – lvthillo Sep 18 '17 at 13:16
  • I know about the import/export feature, but the repos also need to stay in sync until the migration of some Jenkins jobs is finished. – lvthillo Sep 18 '17 at 13:18
  • You can use the "incremental backup" option and import only the changed repository into the new instance. Meaning: run a backup with the incremental option; once it's done, import the data into the new instance; after the full import is done, run the incremental backup again and import what has changed. Does that make sense? (A rough sketch of scripting this is below these comments.) – Ariel Sep 18 '17 at 17:34
  • @Ariel, thanks, I see; so it's not really the import/export feature but the backup one. I have one big repository which actually contains multiple repos. Maybe not the right approach, but does this mean that the first time the whole repo is backed up and the second time only the new packages that are now inside it? Does that happen automatically, or does it only work at repository level? – lvthillo Sep 19 '17 at 05:54
  • What we will have is one repo with 100 different subfolders and packages. We will migrate that repo and adapt the Jenkins jobs for 5 of the 100 subfolders, so those 5 will receive new packages while the others are still created on the old repo. Then we will move the up-to-date state of 5 other packages and again adapt some Jenkins jobs to deploy to the new repo instead of the old one. So we do indeed need an incremental import while keeping the new packages that are created on the new repo. – lvthillo Sep 19 '17 at 06:00
  • Yes, that is exactly what incremental means. If that solved your inquiry, consider marking the answer as accepted so others will know that it worked for you. – Ariel Sep 19 '17 at 09:13
  • I've recreated this scenario. The only problem is: I redeploy an existing artifact in our new Artifactory, and when I then import the incremental backup from the old Artifactory, it overwrites the redeployed artifact with the content from the old instance. It seems to match only on file name, not on checksum? (Example: text1.txt has new content in the new Artifactory but exists with other content in the old one; after the import, the old content of text1.txt is back in our new Artifactory, even though the checksum was different.) – lvthillo Sep 27 '17 at 09:43
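For reference, the incremental backup/import workflow discussed in the comments above can presumably be scripted as well, e.g. by triggering a system export on the old instance through the REST API and importing the resulting directory on the new one. This is only a rough sketch; the endpoint and the payload field names should be checked against the REST API documentation for your Artifactory version:

    # Rough sketch: trigger an (incremental) system export on the old instance.
    # The payload field names are assumptions to verify against your version's docs.
    curl -u "$ADMIN_USER:$ADMIN_PASS" -X POST \
        -H "Content-Type: application/json" \
        -d '{"exportPath":"/var/backup/export","includeMetadata":true,"createArchive":false,"incremental":true}' \
        "https://old-server/artifactory/api/export/system"

The exported directory can then be imported on the new instance (Admin --> Import & Export), and the export/import cycle repeated with the incremental option once the remaining Jenkins jobs have been switched over.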