The benefits Mercurial would have are threefold:

1. You can push an additional head to the central repository if you really want to (but it's probably a bad idea). Subversion doesn't have the concept of "heads", so you can't do that.

2. You can do iterative merge+commit (there's a command sketch below). You'd pull the latest head from the "trunk" repository, merge in your change and commit that locally. Then you'd pull any additional changes from the trunk and merge them as a new changeset on top of your previous merge changeset. With Subversion you can't do that - you have to keep trying to get the merge done in one hit.

3. The merging tools are smarter (as dahlbyk has said), so these sorts of merges are generally easier to do.

But it's still not going to be perfect - complex merges are hard, not (generally) because the VCS is lacking, but because you need to apply your brain to work out how to take the right pieces from each person's (or group's) work. That's hard to automate, so it's slow and sometimes painful work.

Better development processes and communication can help, but if you have multiple groups doing relatively independent work on the same files then you've got to deal with merging somewhere.
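For concreteness, here's a minimal sketch of that iterative merge+commit cycle with Mercurial. The trunk URL is just a placeholder and the commit messages are made up:

    # Pull the latest head from the "trunk" repository (placeholder URL).
    hg pull https://example.com/trunk
    # Merge your change with it and commit the merge locally.
    hg merge
    hg commit -m "Merge trunk into local work"
    # Later, pull any additional trunk changes and merge them as a new
    # changeset on top of the previous merge changeset.
    hg pull https://example.com/trunk
    hg merge
    hg commit -m "Merge newer trunk changes"
    # Push once the result is what you want on trunk.
    hg push https://example.com/trunk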
By far the biggest benefit from my perspective is that the person making the changes should merge the latest changes from the master repository locally before pushing. Yes, there's a small chance of finishing your merge and then having to do another one because there's a newer commit on the master, but this isn't really a problem in practice, because (a) you don't lose all your previous merge work - all you have to merge is the changes that have happened since your last pull - and (b) depending on the activity level and the number of client repos, you're not likely to run into this very often. If you do, you can simply add another level to your hierarchy and group your client repos around smaller sub-repos, which can be merged more easily into the master repo. This is really a strong benefit of DVCS.
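A rough sketch of what that looks like in practice with Mercurial (placeholder URL): if trunk has moved on since your merge, the push is refused and the follow-up merge only covers the new changesets.

    # Your merged work is committed locally; try to publish it.
    hg push https://example.com/trunk
    # If someone else pushed first, hg refuses to add a new remote head.
    # Pull only the changesets that landed since your last pull...
    hg pull https://example.com/trunk
    # ...and merge just those - the earlier merge work is already recorded.
    hg merge
    hg commit -m "Merge latest trunk changes"
    hg push https://example.com/trunk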
Merge work isn't lost: once you've merged a set of changes, you don't have to do it again.

GitHub model

The GitHub model really shines for large distributed teams.
Instead of giving everyone access to write to the master repo (or repos), you give that right to a trusted few, who respond to pull requests by pulling your merged changes into the master repo and handling any minor merges that need to happen. See help.github.com/send-pull-requests.
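As a hedged sketch of that flow using git - the repository URLs and remote names below are placeholders, and the pull request itself is opened through the GitHub UI as described at the link above:

    # Work in your own fork; "upstream" points at the master repo.
    git clone https://github.com/you/project.git
    cd project
    git remote add upstream https://github.com/maintainer/project.git
    # Bring in the latest upstream work and resolve any conflicts locally.
    git fetch upstream
    git merge upstream/master
    # Publish the merged result to your fork, then open a pull request
    # for one of the trusted maintainers to review and pull.
    git push origin master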
When this happens, usually there are no or few additional conflicts, so the second merge is quick. – Laurens Holst Nov 4 at 9:15
The only really robust solution to this problem is the "gatekeeper" model, either human or automatic. With a human gatekeeper, only one person is responsible for landing finished work on trunk; all other developers never push directly to trunk and work only in their own feature branches. Usually the gatekeeper is the team lead or another careful, trusted person.
With an automatic gatekeeper, a special piece of server software keeps a queue of merge requests and performs the merges as new requests arrive. Every successful merge is then pushed to trunk. Failed merges (because of conflicts) are rejected, and the author of that branch has to merge the latest changes from trunk and resolve the conflicts on their side before repeating the merge request to the gatekeeper.
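A very rough sketch, assuming Mercurial and placeholder URLs, of what the automatic gatekeeper does for each queued request (real gatekeeper software, like the one in the bzr documentation linked below, also handles queuing, authentication and test runs):

    # Take the next request from the queue: merge a developer branch into
    # a fresh copy of trunk (all URLs are placeholders).
    hg clone https://example.com/trunk gatekeeper-work
    cd gatekeeper-work
    hg pull https://example.com/dev/feature-branch
    # Attempt the merge non-interactively; unresolved conflicts make
    # `hg merge` fail, and the request is rejected so the author can
    # merge the latest trunk and resolve conflicts on their side.
    if hg merge --tool internal:merge && hg commit -m "Gatekeeper merge of feature-branch"
    then
        hg push https://example.com/trunk
    else
        echo "Rejected: merge latest trunk, resolve conflicts, and resubmit"
    fi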
You can read more about the idea of gatekeepers in the bzr documentation: doc.bazaar.canonical.com/bzr.2.4/en/user....
Because DVCSs assume work will be done in various repositories and then combined, their merges are smarter than Subversion merges - your pull/merge/build cycle will be faster and less painful. But honestly, this sounds more like a people/process problem than a technology problem. Communication will help more than switching VCS.