State of Python Packaging: Buildout, Distribute, Distutils, EasyInstall, etc?

Distribute is a new fork of setuptools (easy_install), which should also be considered. Even Guido recommends it. Buildout is orthogonal to packaging; you can use Buildout together with Distribute.

First of all, regardless of which installation tool you decide on, start using virtualenv --no-site-packages! That way, Python packages are not installed globally and you can easily get back to where you were in old as well as new projects. Now, your comparison is a little apples-to-pears, as the tools you list are not mutually exclusive.
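For context, the isolation that virtualenv's --no-site-packages flag provides is the default behaviour of the venv module that later became part of the standard library. A minimal sketch of creating such an isolated environment programmatically (the target path is a hypothetical example):

```python
# Programmatic equivalent of `virtualenv --no-site-packages env`:
# the stdlib venv module isolates from global site-packages by default.
import os
import tempfile
import venv

target = os.path.join(tempfile.mkdtemp(), 'env')  # hypothetical location
venv.create(target, with_pip=False)               # with_pip=False keeps it fast

# The environment gets its own interpreter layout and a pyvenv.cfg marker.
print(os.path.exists(os.path.join(target, 'pyvenv.cfg')))  # → True
```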

However, I can wholly recommend Buildout. It will install Python packages as well as other stuff, and lets you automate installation and deployment of (complex) projects. Also, I recommend looking into Fabric as a means of automating administrative tasks.

I've done quite a bit of research on this topic (a couple of weeks' worth) before settling on using Buildout for all of my projects: distutils and easy_install in addition to Buildout! The difficulty in creating one place to compare all of these tools is that they are all part of the same tool chain and are used together to create a predictable, reliable and flexible tool set.

For example, easy_install is used to install distutils packages from PyPI (the Cheeseshop) into your system Python's site-packages directory. This drastically simplifies installation of packages onto your system/global sys.path. easy_install is very convenient for packages that are consistent across all projects.
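To make the target concrete: easy_install writes into the interpreter's site-packages directories, which you can locate from Python itself. A small sketch using the stdlib site module:

```python
# Show where packages installed with easy_install would land: the
# interpreter's site-packages directories, which site.py puts on sys.path.
import site

for directory in site.getsitepackages():
    print(directory)
```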

But I find that I prefer to use the system's easy_install to install packages that projects do not depend on. For example, I use github-cli with every project, because it allows me to interact with a project's GitHub issues from the command line. I use it alongside projects, but only for convenience; the project itself does not have a dependency on this package.

For managing a project's dependencies, I use Buildout. Buildout allows you to indicate specifically which versions of packages your project depends on. I prefer Buildout over pip's requirements.txt because Buildout is declarative. With pip, you install the packages and at the end of development you generate the requirements.txt file.

With Buildout, on the other hand, you modify buildout.cfg before the package egg is added to your project. This forces me to be conscious of which packages I'm adding to the project.
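To illustrate the declarative style, here is a minimal buildout.cfg sketch that pins an egg to an exact version (the part name and package name are hypothetical; zc.recipe.egg is the standard egg-installing recipe):

```ini
[buildout]
parts = app
versions = versions

# Exact version pins live in their own declarative section.
[versions]
some.package = 1.2.3

[app]
recipe = zc.recipe.egg
eggs = some.package
```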

Now, there is the matter of virtualenv. One of the most publicized features of virtualenv is obviously the --no-site-packages option. I have not found that option to be particularly useful, because I use Buildout.

Buildout manages sys.path and includes only the packages I tell it to include. It also includes everything in the system Python's site-packages, but since I don't have anything there that I use in projects, I never have conflicts.
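The effect described above can be sketched in plain Python: a buildout-generated script pins sys.path by prepending an explicit list of egg directories before anything is imported (the paths below are hypothetical examples, not real buildout output):

```python
# Sketch of how a buildout-generated script controls sys.path: it prepends
# the exact eggs the part asked for, so those versions shadow anything else.
import sys

pinned_eggs = [
    '/opt/project/eggs/some.package-1.2.3.egg',   # hypothetical egg paths
    '/opt/project/eggs/another.package-0.9.egg',
]
sys.path[0:0] = pinned_eggs  # prepend: pinned versions win over site-packages

print(sys.path[0])  # → /opt/project/eggs/some.package-1.2.3.egg
```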

Also, I find that --no-site-packages only hinders my development process, because I install some packages through my system's packaging system. Usually, anything that has C libraries that need to be compiled, I install through the system's packaging system. In the project's fabfile.py I include a test function that checks for the presence of the packages I install through the system's package manager.
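The answer doesn't show that fabfile, but the idea can be sketched without Fabric: probe, in a subprocess, whether a module that must come from the system package manager is importable (the module names here are just examples):

```python
# Check whether a package (e.g. one installed via apt-get/yum) is importable.
import subprocess
import sys

def has_module(module_name):
    """Return True if `module_name` imports cleanly in this interpreter."""
    result = subprocess.run(
        [sys.executable, '-c', 'import ' + module_name],
        capture_output=True,
    )
    return result.returncode == 0

print(has_module('os'))                  # → True (stdlib, always present)
print(has_module('no_such_module_42'))   # → False
```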

In summary, here is how I use these tools:

- System's package manager (apt-get, yum, port, fink, ...): I use one of these to install the Python versions I need on the system, and to install packages like lxml which include C libraries.
- easy_install: I use it to install packages from PyPI that I use on all projects, but which projects are not dependent on.
- Buildout: I use it to manage a project's dependencies.

In my experience, this workflow has been very flexible, portable and easy to work with.

Whenever I need to remind myself of the state of play, I look at these as a starting point: The State of Python Packaging, a response to: On packaging, linked from: Tools of the Modern Python Hacker.

I can't easily help you with the choice, but I can make it a bit harder: it also depends on the platform you want to use. For example, if you need to install Python packages on Gentoo (GNU/Linux) based computers, you can easily use g-pypi to create ebuilds for all packages which use distutils (or rather: a setup.py).

That way they get completely integrated into your system and can be added, updated and removed like all your other tools. But naturally this only works for Gentoo-based systems. You can also use yolk to find out about all packages installed via easy_install on your system (not only on Gentoo).
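yolk may not be installed everywhere; a rough stdlib equivalent of its listing, using importlib.metadata (available since Python 3.8), looks like this:

```python
# List installed distributions, roughly what `yolk -l` reports.
import importlib.metadata

for dist in sorted(importlib.metadata.distributions(),
                   key=lambda d: (d.metadata['Name'] or '').lower()):
    print(dist.metadata['Name'], dist.version)
```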

When I write code, I simply use distutils (because it allows building Portage ebuilds very easily) and sometimes basic setuptools features, or I organize my programs so people can just download and run them from the program folder (ideally just unpack the source archive / clone the repository somewhere). This isn't the perfect solution, but until the core Python team decides which way they want to move, I don't want to commit (anymore) to a path which might disappear.

