ActiveState Platform Demo: Virtualenv vs Pipenv
For example, previously, in order to create virtual environments so you could run multiple projects on the same computer, you’d need:
- A tool for creating a virtual environment (like virtualenv or venv)
- A utility for installing packages (like pip or easy_install)
- A tool/utility for managing virtual environments (like virtualenvwrapper or pyenv)
- All the commands associated with the libraries used
Pipenv includes all of the above, and more, out of the box. It essentially gives you package management and virtual environment support in a single tool. Learn more in this blog.
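To make the comparison concrete, here’s a rough sketch of the two workflows side by side; the package names below are just placeholders, not the ones from this demo:

```bash
# Old workflow: separate tools for the environment and the packages
virtualenv --python=python3 .venv   # create the virtual environment
source .venv/bin/activate           # activate it
pip install flask numpy             # install packages into it

# Pipenv workflow: one tool does both
pipenv install flask numpy          # creates a virtualenv if needed, installs the
                                    # packages, and records them in a Pipfile
```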
So the virtual environment that I was mentioning, if you don’t know, is a self-contained sandbox environment just for your application. It’s a tiny little sandbox with a Python install and nothing else in it. It only has the packages that you specify, and they’re kept separate from the system ones, so if you want just the stuff for your application, you can build it in that virtual environment.
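If you want to see that isolation for yourself, a minimal sketch using the standard-library venv module looks something like this (the directory name “sandbox” is arbitrary):

```bash
python3 -m venv sandbox      # create the sandbox environment
source sandbox/bin/activate  # activate it
pip list                     # only pip/setuptools – none of the system packages
pip install flask            # installed into the sandbox only, not system-wide
deactivate                   # drop back to the system Python
```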
They’re somewhat awkward to work with – they have odd names and all that kind of stuff – but they’re really, really important for deployment and development. There’s a tool called virtualenv to create and manage them, but there’s a newer tool that combines pip and virtualenv, since they’re really two sides of the same coin, and that’s Pipenv. This is the new “community standard” – the Python Packaging Authority recommends it as the standard tool for managing application dependencies – and it combines pip and virtualenv and extends their functionality in a single app.
You can install it with pip – I’ve already got it installed on mine – and then you can use it to initialize an environment. So if you wanted, you could create a clean Python 3 environment with “pipenv --three”, but you can also use it to manage and install the dependencies for your application. A Pipfile is like a requirements.txt but with a little bit more info, and we could generate one from the requirements.txt that we had using the following command: pipenv install. Now, I already have the Pipfile in the repo – I put it there for convenience – so we’ll just go take a look at it.
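As a quick sketch of that initialization step (note that “pipenv --three” was the flag in Pipenv releases of this era; newer releases use “pipenv --python 3”, and app.py is just a placeholder script name):

```bash
pipenv --three            # create a clean Python 3 virtual environment for this project
pipenv shell              # spawn a shell inside that environment
pipenv run python app.py  # or run a single command in it without activating a shell
```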
So if you look at the Pipfile there, you can see that the packages section has essentially the same information that was in the requirements.txt – it specifies those exact versions of Flask, NumPy, and TensorFlow – but it also has additional sections that give me more information. If I didn’t have a Pipfile, I could run “pipenv install” and it would generate one from my requirements.txt, and then it would actually go and create a virtual environment and install the dependencies in there.
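In other words, roughly this, assuming a requirements.txt sits in the project directory:

```bash
# No Pipfile yet: import the pins from requirements.txt; this writes a Pipfile
# and installs everything into a fresh virtual environment.
pipenv install -r requirements.txt

# Pipfile already in the repo (as in this demo): plain `pipenv install`
# creates the virtual environment and installs whatever the Pipfile specifies.
pipenv install
```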
If we go back here, you see that there’s the Pipfile. One of the extra pieces of functionality, you’ll notice, is that it allows you to designate the source – where I’m getting my packages from, in this case PyPI. But I also have the ability to have different sets of packages for development versus production, and I can specify things like what version of Python the project requires. So it gives me a little more control and flexibility than the requirements.txt, and it’s designed to capture all the possible use cases.
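For reference, a minimal Pipfile along those lines looks roughly like the one below, written out here as a shell heredoc so you can try it; the exact version pins and the dev-packages entry are illustrative, not copied from the demo repo:

```bash
cat > Pipfile <<'EOF'
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true

[packages]
flask = "==1.0.2"
numpy = "==1.16.2"
tensorflow = "==1.13.1"

[dev-packages]
pytest = "*"

[requires]
python_version = "3.6"
EOF
```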
So then, the next stage is generating a lock file, which is the fully resolved dependency tree for the project. In this case we would run “pipenv lock”, and this is required for a deterministic build because, like I said before, if we don’t do this, the dependencies of our packages could get updated underneath us. Now, I should give you a warning here: pipenv lock may fail – it’s not guaranteed to succeed. It could fail to resolve a dependency conflict, which means that if Package A depends on Package B and Package C also depends on Package B, but they depend on different versions, there’s no obvious way for it to resolve that. When that happens, you need to think about how to handle it yourself – whether that’s rolling the dice and manually updating one of the package versions, or potentially patching out the conflicting functionality. There are all kinds of ways to resolve it, but there’s no guaranteed way for the tool to do it automatically for you, so the lock step could fail.
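A sketch of that step, plus a command that’s handy when resolution does fail:

```bash
pipenv lock   # resolve the full dependency tree and write Pipfile.lock
pipenv graph  # print the resolved tree – useful for spotting which packages
              # are pulling in conflicting versions when locking fails
```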
So if we look at our Pipfile.lock – it can take quite a while to generate, so we have it there already – it’s obviously a lot more complex. It’s got gRPC in there, it has a whole bunch of hashes for known-good versions, and it’s got all of the dependencies of our dependencies. This, right here, is all the information we need for a deterministic build. We need to be able to install protobuf and all this other stuff that wasn’t in our Pipfile, so we need to make sure that we generate a lock file.
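Once the lock file is checked in, reproducing the environment elsewhere is roughly this (both commands install from Pipfile.lock rather than re-resolving):

```bash
pipenv sync              # install exactly what Pipfile.lock specifies
pipenv install --deploy  # alternative: fail if Pipfile.lock is out of date with the
                         # Pipfile, instead of silently re-locking
```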