Bruh idk why the difference… Educate me?
Pip stores everything in some random txt file that doesn’t differentiate between the packages you installed and their dependencies.
Pip stores nothing in a text file
If you want to export your local environment, isn’t a requirements.txt usually used?
Yes, but this file is created by you and not pip. It’s not like package.json from npm. You don’t even need to create this file.
Well, if the file had to be created by hand, that would be very cumbersome.
But what’s often done to create it automatically is running
pip freeze > requirements.txt
inside your virtual environment.
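The frozen file pins every package in the environment, including transitive dependencies. For example, after installing only requests, it might look like this (versions here are just an example):

    certifi==2023.7.22
    charset-normalizer==3.2.0
    idna==3.4
    requests==2.31.0
    urllib3==2.0.4

Anyone can then recreate the environment with pip install -r requirements.txt.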
You said I don’t need to create this file? How else would I distribute my environment so it can be easily used? There are a lot of other standards, like setup.py etc., so it’s only one possibility. But the fact that there are multiple competing standards shows that how pip handles this is kind of bad.
I work with Python professionally and would never do that. I add my actual imports to the requirements, and if I forget one, I add it later when the package fails CI/CD tests.
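In other words, a hand-curated requirements.txt lists only the direct imports, loosely pinned, something like (hypothetical packages):

    requests>=2.28
    sqlalchemy>=2.0

pip then resolves the transitive dependencies itself, instead of freezing every package that happens to be installed.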
If you try to keep your dependencies low, it’s not very cumbersome. I usually do that.
A setup.py/pyproject.toml can replace requirements.txt, but it’s for creating packages and does way more than just installing dependencies, so they’re not really competing.
For scripts that have just one or two packages as dependencies, it’s also usual to just tell people to run pip install .
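For illustration, a minimal setup.py might look like this (project name and dependency are hypothetical):

    from setuptools import setup

    # Declare only the direct dependencies; pip resolves the rest.
    setup(
        name="myscript",
        version="0.1.0",
        py_modules=["myscript"],
        install_requires=["requests>=2.28"],
    )

Then pip install . (the dot means the current directory) installs the package along with its dependencies.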
Isn’t it called a requirements.txt because it’s used to export your project requirements (dependencies), not all packages installed in your local pip environment?
Honestly it’s a simple and straightforward solution. What’s wrong with it?
If newer versions are released and dependencies change, you’d still be installing the old, pinned versions. And if the exact versions aren’t stored at all, you can’t reproduce the same environment.
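That trade-off is visible in the requirement specifiers themselves (hypothetical package):

    requests==2.31.0     # exact pin: reproducible, but goes stale
    requests>=2.28,<3    # range: picks up fixes, but not exactly reproducible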
cargo just works, it’s great and everyone loves it.
npm has a lot of issues but in general does the job. When docs say do ‘npm install X’ you do it and it works.
pip is a mess. In my experience doing ‘pip install X’ will maybe install something but it will not work because some dependencies will be screwed up. Using it to distribute software is pointless.
I use pip extensively and have zero issues.
npm pulls in a million dependencies for even the simplest functionality.
Is that really the fault of the package manager, or of the libraries you decide to use?
Fault of the libraries you decide to use, i.e. any and all node libraries
Well if that’s the case, can’t argue with that lol
Really the fault of js since its standard library is so lacking (leftpad, anyone?), but js wasn’t built to do half the stuff it’s being asked to do, anyway.
You’ve never had broken dependencies?
Nope. I know mixing pip with Python packages installed through your system’s package manager can be a problem, but that’s why I containerize everything.
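A minimal sketch of that setup, assuming the project has a requirements.txt and a hypothetical entry point app.py:

    FROM python:3.11-slim
    WORKDIR /app
    # Install dependencies first so Docker can cache this layer.
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt
    COPY . .
    CMD ["python", "app.py"]

The container gets its own Python and site-packages, so pip never touches the host system’s packages.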
I separate everything into virtual environments myself, but in my opinion you shouldn’t need to do that just to avoid breaking your system.
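For reference, the whole venv dance is only a few commands (POSIX shell activation shown):

    python -m venv .venv
    source .venv/bin/activate
    pip install -r requirements.txt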
It probably works for your own local project. After using it for a couple of days to install some third-party tool, my conclusion is that it has no idea about dependencies. It just downloads some dependencies in some random versions and then it never works. Completely useless.
This is clearly a layer 8 issue lmao.