
I have a Java project whose distribution needs two other services running: ZooKeeper and Project Voldemort. I would like a build process that outputs the right version of each project with an already valid configuration, so that publishing a new environment is just a matter of taking the pieces from the build and starting them on the production servers.

I use a version control system for the project, but these two dependencies are a bit too large to check in. My idea is to keep them on an internal FTP server and download them at build time into the dist folder.
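Something like the following is what I have in mind for the fetch step (just a sketch; the FTP host and archive names are placeholders for our internal server):

```java
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class FetchDependencies {
    // Placeholder internal FTP locations; not real hosts or file names.
    private static final String[] ARCHIVES = {
        "ftp://build-deps.internal/zookeeper.tar.gz",
        "ftp://build-deps.internal/voldemort.tar.gz"
    };

    public static void main(String[] args) throws Exception {
        Path distDir = Paths.get("dist");
        Files.createDirectories(distDir);

        for (String archive : ARCHIVES) {
            URL url = new URL(archive);
            Path target = distDir.resolve(Paths.get(url.getPath()).getFileName());
            // Java's built-in ftp:// URL handler streams the file down.
            try (InputStream in = url.openStream()) {
                Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
            }
        }
    }
}
```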

Is this a good practice? Has anyone already done something like this?

    I'm not sure I grasp the question. Are you asking if the use of the FTP server is standard? When you deploy the primary project will you ALWAYS need new deployments of zookeeper and voldemort? What problem do you have now that you're trying to solve? – skiller3 Mar 13 '12 at 22:34
  • New installations may happen more frequently at the beginning, while we spread to new countries, and at that time I'll need ZooKeeper and Voldemort installed too. I want the build to produce three things: (1) my project, compiled and compressed, (2) a compressed, configured ZooKeeper installation, and (3) a compressed, configured Voldemort installation. Has anyone already done this? If so, what approach did you take? – Diego Oliveira Mar 14 '12 at 03:15

1 Answer


At least for ZooKeeper, there are examples of people embedding the server in-process inside their own Java application.
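For instance, something along these lines runs a standalone ZooKeeper inside your JVM (a minimal sketch using the org.apache.zookeeper.server classes; the data directory and port are illustrative):

```java
import java.io.File;
import org.apache.zookeeper.server.ServerCnxnFactory;
import org.apache.zookeeper.server.ZooKeeperServer;

public class EmbeddedZooKeeper {
    public static void main(String[] args) throws Exception {
        File dataDir = new File("build/zk-data"); // snapshot + txn log location
        int tickTime = 2000;                      // ms, same meaning as in zoo.cfg
        int clientPort = 2181;

        // Start a standalone ZooKeeper server inside this JVM.
        ZooKeeperServer server = new ZooKeeperServer(dataDir, dataDir, tickTime);
        ServerCnxnFactory cnxnFactory = ServerCnxnFactory.createFactory(clientPort, 60);
        cnxnFactory.startup(server);

        cnxnFactory.join(); // block until the server is shut down
    }
}
```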

More generally, though, this is a fairly common distribution model: the dependent services are listed in a manifest, and the install/start system uses a common startup configuration file (containing elements such as hostnames, ports, and user IDs) to generate service-specific configuration files and then start each service. I don't know of any open source systems off hand, but I use an in-house system built for this at my company.
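The generator itself does not need to be complicated. As a rough sketch, assuming a shared environment.properties file per environment (the property names and paths here are made up):

```java
import java.io.FileReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Properties;

public class ConfigGenerator {
    public static void main(String[] args) throws IOException {
        // Shared per-environment settings: hostnames, ports, data dirs, user IDs.
        Properties env = new Properties();
        FileReader reader = new FileReader("environment.properties");
        try {
            env.load(reader);
        } finally {
            reader.close();
        }

        // Render one service-specific file (here ZooKeeper's zoo.cfg)
        // from the shared values.
        String zooCfg =
              "tickTime=2000\n"
            + "dataDir=" + env.getProperty("zk.dataDir", "/var/lib/zookeeper") + "\n"
            + "clientPort=" + env.getProperty("zk.clientPort", "2181") + "\n";

        Files.createDirectories(Paths.get("dist/zookeeper/conf"));
        Files.write(Paths.get("dist/zookeeper/conf/zoo.cfg"), zooCfg.getBytes("UTF-8"));
    }
}
```

The same loop then runs for Voldemort and for your own project, so one edit to the shared file reconfigures the whole environment.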

manku