Is it acceptable/good to store binaries in SVN?

The two common reasons you may want to store binaries in a Version Control System are:

Store external third-party libraries. Usually one stores them in a Maven repository, but storing them in SVN allows you to have one and only one referential for everything you need: you get your sources, and you get the libraries required to compile those sources. Everything comes from one repository.

Store deliveries for quicker deployment. Usually deliveries (the executables you build to deploy into production) are built on demand. But if you have many pre-production environments and many deliveries, the cost of building them for the assembly, integration, homologation and pre-production platforms can be high.

A solution is to build them once, store them in a deliveries section of your SVN, and use them directly in your different environments. Note: this also applies to development elements. If you have a JAXB process which generates 900 POJO files (through XML binding) and you need to download that development set into multiple environments, you may prefer one compressed-file copy transaction to 900 individual ones. So yes, it is "acceptable/good to store runtime binaries in the SVN"... for the right reasons.

Also, if you can't use Maven or prefer Ant, then storing the libraries in your repository where they can easily be checked out for an Ant build makes sense. – fluffels Feb 10 '09 at 8:32

@fluffels: agreed. – VonC Feb 10 '09 at 8:38

A third common reason is for storing bitmapped graphics, audio and video. These need to be version controlled too. Certain versions of the same video may only be appropriate for a certain version of the source code that uses it. – Rob Dec 17 '10 at 10:59

@Rob: a/ if they change often and b/ if you need to retrieve an old version, then yes. If not, other repositories (not VCS-based) like Nexus will ensure some history for those elements, and you will be able to remove them from said repository much more easily than from a VCS. – VonC Dec 17 '10 at 11:47

+1 for the Nexus introduction, an idea I should explore myself. – Rob Dec 20 '10 at 13:42

I would say that if it makes your team's lives easier, then do it. If it lessens the time taken to set up a working development environment, go for it.

Yes, store it. We used to store the binaries we delivered to customers in the SVN repository to keep track of them. Another use of storing binaries in SVN (or any source control) is when you provide internal utility modules to other teams in your company who don't want to build your project themselves, saving them build time.

I believe it's a common practice. But we never allowed storing the .classpath and .project files of Eclipse (workspace-related settings).

"But we never allowed to store .classpath and .project files of Eclipse (workspace related settings)": Argh! Too bad. I think it can be useful ;) See stackoverflow.com/questions/337304 – VonC Feb 10 '09 at 8:35

And stackoverflow.com/questions/116121#119377, and stackoverflow.com/questions/300328 – VonC Feb 10 '09 at 8:37

@VonC: Thanks for the links, your answers are interesting. I'd still prefer to keep most of these files outside of the source control, mostly in a wiki. But I'll think about it :) – artknish Feb 10 '09 at 9:45

It's perfectly fine and acceptable to store binaries in the SVN repo. As a side note, I can't see why anyone would want to store build artifacts in the repository (I'm not saying you do that).

We have one or two little programs that rarely change, and are an absolute bugger to get compiled, needing a specialised environment and what-have-you. We store them in SVN for the sake of simplicity/sanity. – Evan Feb 10 '09 at 8:38

Actually, now that I think about it, we also have some artifacts that cannot be built on the command line or in an automated way, and we often vacillate over whether it is worthwhile putting them in the repository (SWF files, for example).

Not for this purpose, no. You should use an external file store, like an FTP or Web server. This way it is easy to download a particular version of your runtime binary without having to update to that revision in SVN first.

If you're developing in Java, then you can set up a local repository and then use a tool like maven or ivy+ant to access it. You can upload updates of your local build artifacts back to your local repository as they are ready for others in the company to use. For other development environments, I don't know what tools similar to the above are available - I have tended to just put them in SVN and be done with it.
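To make the "local repository" idea concrete for the Java case, here is a minimal sketch of an ivysettings.xml that points Ivy at an intranet repository. The host name and path patterns below are placeholders, not a description of any particular setup:

    <!-- ivysettings.xml: minimal sketch; repo.example.local and the patterns are placeholders -->
    <ivysettings>
        <settings defaultResolver="company-repo"/>
        <resolvers>
            <!-- resolve (and publish) artifacts against the intranet repository -->
            <url name="company-repo">
                <ivy pattern="http://repo.example.local/repo/[organisation]/[module]/[revision]/ivy-[revision].xml"/>
                <artifact pattern="http://repo.example.local/repo/[organisation]/[module]/[revision]/[artifact]-[revision].[ext]"/>
            </url>
        </resolvers>
    </ivysettings>

An Ant target can then call the Ivy tasks (ivy:resolve followed by ivy:retrieve) to copy the resolved jars into a local lib directory before compiling, so no jars need to live in the source repository.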

I usually use a separate repository for storing third-party libraries, to keep them out of the regular development repositories, and have my build files load them from an expected location relative to the project's base folder. Actually, I use two repositories: one for the minimal files I need to build my projects (e.g. jar and lib files) and another for the entire third-party package (including the source, documentation or whatever), which I usually store as a tar.bz2.

That way, if you just want the minimum you need to build stuff, you grab the first repository, and if you need to figure out what is going on with, or how to use, a third-party package, you can start pulling stuff out of the second repository. It ain't the ideal solution, but it works pretty well. Here is some more information on how svn handles binary files.
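A sketch of what the Ant side of this can look like, assuming the minimal libraries repository is checked out into a third-party folder next to the project (the names and layout here are illustrative, not part of the answer above):

    <!-- build.xml fragment: compile against jars checked out from the libraries repository -->
    <property name="thirdparty.dir" location="${basedir}/../third-party"/>

    <path id="compile.classpath">
        <!-- pick up every jar from the checked-out libraries repository -->
        <fileset dir="${thirdparty.dir}" includes="**/*.jar"/>
    </path>

    <target name="compile">
        <mkdir dir="build/classes"/>
        <javac srcdir="src" destdir="build/classes" classpathref="compile.classpath"/>
    </target>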

As many have already said, it's acceptable. Yes, it is convenient to have everything handy in one location, from where you can (for example) check out an older tag already in binary form with its correct dependencies. But it is NOT good, especially for backup purposes.

We stored all our binaries (and some of the dependencies) in SVN, and as the project grew, so did that binary section. Unfortunately, svnadmin dump just dumps everything; you cannot specify a path of the repository to exclude. Thus, backups (and upgrades of the SVN server) became very painful!

Add to that the fact that, in our case, those binaries were not useful anymore after a not-so-long time, and I'm sure I would not do it again in a similar case (though I would for a smaller project). So I would recommend thinking twice before doing that, and trying to forecast how big you can grow and what else might happen.

I would let my build and continuous integration system handle the latest working version of things, by automatically copying them to an FTP, web or file share for easy access. Even better, I would invest in a CI system that automatically handles build artifacts; I love TeamCity from JetBrains myself, but there are others. This way you can handle it fully automatically.

Storing binaries under version control perhaps defeats the purpose of version control. You are better off using HTTP/FTP. This discussion on SO at stackoverflow.com/questions/104453/versi... might be useful!

Our Java .jar file builds were binding in their .jar file dependencies, which we were checking into SVN. A lot of this was redundant in practice, but we wanted to ensure every Java app build we produced had precisely the libraries it underwent QA with.

What really aggravated me with this approach, though, was when I started doing remote connections to the repository and syncing: it would take forever just to churn through all the binary libraries.

We've since abandoned that practice and now use Maven to manage library dependencies - even for projects that we're still building with ant. No more binaries being checked into svn. Life is much better on several fronts because of this shift of strategy.

And we have the rigorous control over versions of library dependencies that we desired. For our .NET builds, one of my developers has devised a solution that works in large part like Maven with respect to all the dependency management, and it is achieving much the same benefit there too.
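For illustration, the Maven side of that rigour is simply that every library is declared in the pom.xml with an explicit version, so nothing needs to be checked into svn. The coordinates below are just era-typical examples, not the actual libraries involved:

    <!-- pom.xml fragment: dependencies pinned to explicit versions (example coordinates) -->
    <dependencies>
        <dependency>
            <groupId>commons-lang</groupId>
            <artifactId>commons-lang</artifactId>
            <version>2.4</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.5</version>
            <scope>test</scope>
        </dependency>
    </dependencies>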

When you check jars into SVN, two questions come up: which version is it? (usually you have axis.jar and not axis-1.4.jar) and why was it included? (especially tricky with dependencies of dependencies). If you don't have a dependency management system in place, you normally can't answer both questions.

And it's the first step to Jar Hell. I can recommend Apache Ivy (others may swear by Maven) with an intranet repository. Using Ivy, I never had to store libraries in SVN and could always answer the above-mentioned questions.
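A minimal sketch of an ivy.xml that answers both questions; the module and dependency coordinates are illustrative only:

    <!-- ivy.xml: the explicit rev answers "which version is it?", and the fact that the
         dependency is declared here (with transitive dependencies resolved by Ivy against
         the intranet repository) answers "why was it included?" -->
    <ivy-module version="2.0">
        <info organisation="com.example" module="myapp"/>
        <dependencies>
            <dependency org="org.apache.axis" name="axis" rev="1.4"/>
        </dependencies>
    </ivy-module>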

Store binaries that not everyone can build. I design chips in Verilog and VHDL and the software team doesn't have those tools. So we store the output binaries in SCM.

There's some contention on this matter but I say yes.

At least I store the binaries in SVN; that way I can quickly revert to a particular version's binary and see whether the bug was happening in it or not, and trace the version where the bug was introduced, rather than checking out the whole project, setting up all the project-related and environment settings, and then compiling it.

No, don't store binaries next to their source code (unless you have good reasons that offset the disadvantages). Disadvantages:

- It encourages bad build practices in large projects. The best practice is to fully automate your build; committing binaries enables you to ignore that: "just manually do the build for the parts that have changed" :-(
- Slower updates and commits.
- Commits which change source code but not the corresponding binaries will cause confusion among developers. How do you detect that there is a mismatch?
- svn update will update the timestamp of your binaries, confusing your build tools, which will erroneously think the binaries are newer than your source code changes.
- It uses more disk space in the repository. (This may be negligible depending on your project.)

In general, avoid committing anything that is generated automatically in a deterministic way from other versioned resources. No redundancy -> no opportunity for inconsistencies. Instead, use a continuous integration server to automatically rebuild (and run the tests) on each commit.

Let this build server publish the binaries somewhere (outside SVN) if necessary. Now, this does not mean that you should take this as an absolute rule or avoid all binaries. By all means, put your build tools and third-party closed-source binaries inside your projects.

And make exceptions when it makes sense. Ideally, a new developer should be able to do a checkout and immediately launch the build script, without spending a day or two setting up his environment.
