Creating requirements.txt in GitLab


Probably a silly question, but I am trying to set up a project in GitLab that is going to be used for deployment of an ML model, for which I will use FastAPI. I'm very new to this and will try to provide as much info as possible.

I created the project in GitLab, which right now only contains a README.md file. The actual Python code is stored in a folder on my computer ("MyProject"), which contains two folders, each of which contains some data, .py scripts and a notebook.

To set up requirements.txt, I tried to create a virtual environment in Windows. Now, when I open the "MyProject" folder, it contains those two folders with code and the virtual environment, which also contains Lib, Scripts and pyvenv.cfg. The latter contains:

home = c:\users\me\anaconda3
implementation = CPython
version_info = 3.8.5.final.0
virtualenv = 20.10.0
include-system-site-packages = false
base-prefix = c:\users\me\anaconda3
base-exec-prefix = c:\users\me\anaconda3
base-executable = c:\users\me\anaconda3\python.exe

I also cloned the GitLab repo, but on my computer it is saved somewhere else (in c:\users\me). I know that I need to do:

pip install -r requirements.txt

But I don't understand how to actually add this requirements file. All of the packages and libraries that I needed for my ML model were installed a long time ago with anaconda, before I created this virtual environment. Have I done anything wrong?

CodePudding user response:

I think you mixed up some things. GitLab uses Git for version control of your files (your code), so your repository should contain the files with your code. You can simply put the files from your "MyProject" folder into the folder where you cloned the repository. Also add the requirements.txt, the README file and so on.
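
For example, assuming the repository was cloned to c:\users\me\my-gitlab-repo (a placeholder path, adjust it to wherever yours actually is), the workflow would roughly be:

cd c:\users\me\my-gitlab-repo
:: copy the contents of "MyProject" into this folder first, then:
git add .
git commit -m "Add project code"
git push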

The virtual environment is used to keep your system installation of Python clean and to install only the packages each project actually needs, which among other things avoids package requirement conflicts. The usage of a requirements.txt file is independent of the virtual environment, even though the two are a sensible combination.
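
As a rough sketch, creating and activating a fresh virtual environment on Windows looks like this (the folder name venv is just a convention):

cd c:\users\me\my-gitlab-repo
python -m venv venv
venv\Scripts\activate

You would typically also list the venv folder in .gitignore: the environment itself should not be committed, only the requirements.txt that describes it.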

In general this means your requirements.txt is always shared together with your code, because it lives in the same repository. When someone clones the repository, they can use the requirements.txt to install all the dependencies into their venv (or elsewhere) and then run your code without needing to install further Python packages.
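
From the other person's side, the whole workflow is then roughly this (the repository URL and folder names are placeholders):

git clone https://gitlab.com/<your-user>/<your-project>.git
cd <your-project>
python -m venv venv
venv\Scripts\activate
python -m pip install -r requirements.txt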

Your requirements.txt file has to contain lines that look like this: numpy==1.21.4. Then you have to activate the environment with <your path to the venv folder>\venv\Scripts\activate and use python -m pip install -r requirements.txt to install the packages listed in your requirements.txt.

CodePudding user response:

You can create requirements.txt using pip freeze > requirements.txt and add it to your repo. This will generate a list of your installed packages with the exact versions you have.

https://pip.pypa.io/en/stable/cli/pip_freeze/
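
For instance, inside the activated environment (the package list shown here is only illustrative output; yours will differ):

pip freeze > requirements.txt
:: show the result (sample output):
type requirements.txt
numpy==1.21.4
pandas==1.3.4
fastapi==0.70.0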

CodePudding user response:

A simple solution would be pip freeze > requirements.txt, but this command will add all the packages present in your environment, which may not all be used in your project. In my daily job I use pipreqs (https://pypi.org/project/pipreqs/): you can install it and run pipreqs --force in your project folder. This will add only the packages that are actually used in your project to requirements.txt.
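
A minimal example of that workflow, run from the root of your project folder:

pip install pipreqs
pipreqs --force .

The --force flag makes pipreqs overwrite an existing requirements.txt.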

CodePudding user response:

how to actually add this requirements file

You create the file yourself. For every library that you use, add a line in requirements.txt with the name of the library. Also see the documentation: https://pip.pypa.io/en/stable/reference/requirements-file-format/
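
A hand-written requirements.txt could look like this (the package names and version pins are only examples; list whatever your code actually imports):

# requirements.txt
fastapi==0.70.0
uvicorn==0.15.0
numpy>=1.21,<2.0
scikit-learn==1.0.1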

After creating the file, commit and push it to the git repository.

All of the packages and libraries that I needed for my ML model were installed a long time ago with anaconda

GitLab CI with the Docker executor starts with a fresh environment, so you have to repeat inside the Docker environment all the installation steps that you did on your workstation. You can run an instance of the container locally with Docker for testing. Consult the GitLab CI and Docker documentation.
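
As a rough sketch of what that repetition looks like in practice (the image tag, job name and test command are assumptions to adapt to your project), a minimal .gitlab-ci.yml could be:

image: python:3.8

test:
  script:
    - pip install -r requirements.txt
    - python -m pytest

The python:3.8 Docker image starts clean on every pipeline run, so the pip install -r requirements.txt step recreates your environment before anything else executes.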
