
Suppose I have a project "MyFramework" that contains code used across quite a few solutions. Each solution is under its own source control (SVN).

MyFramework is an internal product and doesn't have a formal release schedule, and same goes for the solutions.

I'd prefer not having to build and copy the DLLs to all 12 projects; new developers should be able to just do an svn checkout and get to work.

What is the best way to share MyFramework across all these solutions?

Rok Strniša
Kyle West

7 Answers

Since you mention SVN, you could use externals to "import" the framework project into the working copy of each solution that uses it. This would lead to a layout like this:

C:\Projects
  MyFramework
    MyFramework.csproj
    <MyFramework files>

  SolutionA
    SolutionA.sln
    ProjectA1
      <ProjectA1 files>
    MyFramework   <-- this is a svn:externals definition to "import" MyFramework
      MyFramework.csproj
      <MyFramework files>
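For reference, such an external can be defined by setting the svn:externals property on the solution's working-copy root. A minimal sketch, assuming MyFramework lives at the repository root (the paths are hypothetical):

```shell
# Write an externals definition that maps a MyFramework sub-directory to the
# framework's location in the same repository (^/ is repository-root-relative).
# The path ^/MyFramework/trunk is an example; adjust to your layout.
cat > externals.def <<'EOF'
MyFramework ^/MyFramework/trunk
EOF
# Then, from SolutionA's checkout root (requires a Subversion working copy):
# svn propset svn:externals -F externals.def .
# svn update    # fetches MyFramework into the working copy
```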

With this setup, the source code of MyFramework is available in each solution that uses it. The advantage is that you can change the source code of MyFramework from within each of these solutions (without having to switch to a different project).

BUT: at the same time this is also a huge disadvantage, since it makes it very easy to break MyFramework for some solutions when modifying it for another.

For this reason, I have recently dropped that approach and am now treating our framework projects as a completely separate solution/product (with their own release-schedule). All other solutions then include a specific version of the binaries of the framework projects.

This ensures that a change made to the framework libraries does not break any solution that is reusing a library. For each solution, I can now decide when I want to update to a newer version of the framework libraries.

M4N
  • I like this approach, I'll have to read more about externals. I'm assuming you can change everything and commit to the right repo when you're done. The framework is very young (we recently realized we were writing the same code all over the place) so being able to make quick changes is high on the priority list. Making sure those changes don't also break everything else is also important. – Kyle West May 29 '09 at 13:58

That sounds like a disaster... how do you cope with developers undoing/breaking the work of others?

If I were you, I'd put MyFramework in a completely separate solution. When a developer wants to develop one of the 12 projects, he opens that project's solution in one IDE and opens MyFramework in a separate IDE.

If you strong-name your MyFramework assembly and GAC it, and reference it in your other projects, then the "copying DLLs" won't be an issue.

You just build MyFramework (a post-build event can run GacUtil to put it in the assembly cache) and then build your other project.
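As a sketch of that post-build event (the $(TargetPath) macro syntax is Visual Studio's; gacutil must be on the PATH from a Windows SDK command prompt):

```shell
# Build the post-build event line that installs the strong-named assembly into
# the GAC. The single quotes keep Visual Studio's $(TargetPath) macro literal.
POSTBUILD='gacutil /i "$(TargetPath)"'
echo "$POSTBUILD"
# Paste this into: Project Properties -> Build Events -> Post-build event.
```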

Eoin Campbell
  • that is certainly a consideration, but we do have CI on all projects so at least we'd find it soon. As far as GAC, I'd like to avoid the GAC if at all possible. – Kyle West May 29 '09 at 13:55

The "best way" will depend on your environment. I worked in a TFS-based, continuous integration environment, where the nightly build deployed the binaries to a share. All the dependent projects referred to the share. When this got slow, I built some tools to permit developers to have a local copy of the shared binaries, without changing the project files.

John Saunders

Does work in any of the 12 solutions regularly require changes to the "framework" code?

If so, your framework is probably new and still taking shape, so I'd just include the framework project in all of the solutions. After all, if work dictates that you have to change the framework code, it should be easy to do so.

Since changes in the framework made from one solution will affect all the other solutions, breaks will happen, and you will have to deal with them.

Once you rarely have to change the framework as you work in the solutions (this should be your goal) then I'd include a reference to a framework dll instead, and update the dll in each solution only as needed.

Console

svn:externals will take care of this nicely if you follow a few rules.

First, it's safer if you use relative URIs (starting with a ^ character) for svn:externals definitions and put the projects in the same repository if possible. This way the definitions will remain valid even if the subversion server is moved to a new URL.

Second, make sure you follow the following hint from the SVN book. Use PEG-REVs in your svn:externals definitions to avoid random breakage and unstable tags:

You should seriously consider using explicit revision numbers in all of your externals definitions. Doing so means that you get to decide when to pull down a different snapshot of external information, and exactly which snapshot to pull. Besides avoiding the surprise of getting changes to third-party repositories that you might not have any control over, using explicit revision numbers also means that as you backdate your working copy to a previous revision, your externals definitions will also revert to the way they looked in that previous revision ...
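Following that advice, a pinned externals definition could look like this (the revision number and repository path are examples):

```shell
# Pin the external to an explicit revision: -r1234 is the operative revision
# and @1234 the peg revision, so later commits to MyFramework cannot surprise
# any solution that imports it.
cat > externals.def <<'EOF'
MyFramework -r1234 ^/MyFramework/trunk@1234
EOF
# Applied in the solution's working-copy root:
# svn propset svn:externals -F externals.def .
```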

Wim Coenen
  • Other answers mention that using svn:externals might easily break other solutions. I want to point out (again), that using explicit revision numbers can eliminate this problem. Using a specific rev should lead to the same result as copying a specific binary into the solution, plus you get easier debugging and no copying of binaries around. I could imagine using the current rev of MyFramework on the trunk of all solutions and when I create a RC for a solution I will pin the external to a specific rev of MyFramework. – Tom Oct 10 '17 at 15:57

A scalable solution is to set the svn:externals on the solution directory, so that the imported projects appear parallel to your other projects. The reasons for this are given below.

Using a separate sub-directory for "imported" projects, e.g. externals, via svn:externals seems like a good idea until you have non-trivial dependencies between projects. For example, suppose project A depends on project B, and project B on project C. If you then have a solution S with project A, you'll end up with the following directory structure:

# BAD SOLUTION #
S
+---S.sln
+---A
|   \---A.csproj
\---externals
    +---B     <--- A's dependency
    |   \---B.csproj
    \---externals
        \---C     <--- B's dependency
            \---C.csproj

Using this technique, you may even end up having multiple copies of a single project in your tree. This is clearly not what you want.

Furthermore, if your projects use NuGet dependencies, these are normally restored into a top-level packages directory. This means that NuGet references of projects inside an externals sub-directory will be broken.

Also, if you use Git in addition to SVN, a recommended way of tracking changes is to have a separate Git repository for each project, and then a separate Git repository for the solution, which tracks the projects via git submodule. If a Git submodule is not an immediate sub-directory of the parent module, the git submodule command will create a clone that is an immediate sub-directory.

Another benefit of having all projects on the same layer is that you can then create a "super-solution", which contains projects from all of your solutions (tracked via Git or svn-external), which in turn allows you to check with a single Solution-rebuild that any change you made to a single project is consistent with all other projects.

# GOOD SOLUTION #
S
+---S.sln
+---A
|   \---A.csproj
+---B     <--- A's dependency
|   \---B.csproj
\---C     <--- B's dependency
    \---C.csproj
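With this layout, a single svn:externals property on S pulls each dependency in parallel with the solution's own projects. A sketch, with hypothetical repository paths:

```shell
# Externals for solution S: B and C are imported as siblings of A, so each
# project exists exactly once in the tree. Repository paths are examples.
cat > externals.def <<'EOF'
B ^/B/trunk
C ^/C/trunk
EOF
# Run inside S's checkout:
# svn propset svn:externals -F externals.def .
```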
Rok Strniša

I agree with another poster: that sounds like trouble. But if you can't or don't want to do it the "right way", I can think of two other ways to do it. We used something similar to number 1 below (for a native C++ app).

  1. A script, batch file, or other process that does a get and a build of the dependency (just once); it only needs to run again when the dependency has changed in the repo. You will need to know which tag/branch/version to get. You can use a .bat file as a pre-build step in your project files.

  2. Keep the binaries in the repo (not a good idea). Even in this case the dependent projects have to do a get, and have to know which version to get.
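A minimal sketch of option 1, assuming a hypothetical repository URL and that svn and msbuild are on the PATH (the actual fetch/build calls are left commented out):

```shell
# Fetch and build the dependency only when the local copy is absent; a real
# pre-build step would also compare revisions to detect upstream changes.
DEP_URL='http://svn.example.com/repos/MyFramework/tags/1.2'   # example tag URL
DEP_DIR='deps/MyFramework'
if [ ! -d "$DEP_DIR" ]; then
    # svn checkout "$DEP_URL" "$DEP_DIR"
    # msbuild "$DEP_DIR/MyFramework.csproj" /p:Configuration=Release
    STATUS='fetch and build needed'
else
    STATUS='dependency already present'
fi
echo "$STATUS"
```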

Eventually what we tried to do for our project(s) was mimic how we use and refer to 3rd party libraries.

What you can do is create a release package for the dependency that sets up a path env variable to itself. I would allow multiple versions of it to exist on the machine and then the dependent projects link/reference specific versions.

Something like

PROJ_A_ROOT = c:\mystuff\libraryA

PROJ_A_VER_X = %PROJ_A_ROOT%\VER_X

and then reference the version you want in the dependent solutions either by specific name, or using the version env var.

Not pretty, but it works.

GEOCHET
Tim