I'm working on creating a local repository that will contain all of the packages I use in my project, so that I can install those packages on a machine that does not have internet access. The idea is that I could clone the repository on that machine and run `yarn install` to make all the packages available in the project from the local repository. How can I do that? A similar question was asked here: Using npm how can I download a package as a zip with all of its dependencies included in the package
CodePudding user response:
There's not enough information in your question to fully understand your situation, but if you commit your `node_modules` directory to the repository, the modules will be there without the user having to run `npm` or `yarn` to install them. This assumes the user will run code from the repo workspace and that there aren't any modules that require a compilation step or other build step that may be platform-specific. But if they're all plain ol' JavaScript modules, you should be fine.
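As a rough sketch of that approach, here is what vendoring `node_modules` into Git might look like. Everything below runs in a throwaway directory, and the `foo` package is a stand-in for a real installed dependency; the main point is removing the usual `node_modules/` ignore rule before committing:

```shell
# Sketch: committing node_modules directly (paths and package names are examples)
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q

# node_modules is conventionally ignored; drop that rule so Git tracks it
printf 'node_modules/\n' > .gitignore
mkdir -p node_modules/foo
echo '{ "name": "foo", "version": "1.0.0" }' > node_modules/foo/package.json
sed -i '/^node_modules\/$/d' .gitignore

git add -A
git -c user.email=you@example.com -c user.name=you \
    commit -q -m "Vendor node_modules for offline use"
git ls-files   # node_modules/foo/package.json is now tracked
```

On a real project you would run `npm install` (or `yarn install`) first so the committed `node_modules` is complete, then clone the repo on the offline machine and run the code directly.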
If you want to have all the modules as a separate repo rather than checking in `node_modules`, I can offhand think of two ways this might work.
- Have the packages repo be a check-in of a fully installed `node_modules` directory. Then make that repo a Git submodule of the main repo that gets cloned as `node_modules` in the main repo.
- Use `npm pack` to create .tgz files for each package you need. Store those files in the packages repo. Clone that repo into a known path on your target machine. Have the main repo install via path names. For example, if you run `npm install /var/packages/foo-1.0.0.tgz`, it will add a line to your `package.json` that might look something like this: `"foo": "file:../../../var/packages/foo-1.0.0.tgz"`. In that case, `npm install` will install from that path rather than over the network.
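The second option can be sketched end to end without touching the network, since `npm pack` also accepts a local folder. In this sketch the `foo` package, the `packages` directory, and the `app` project are all made-up examples standing in for your real dependencies and repos:

```shell
# Sketch: the npm-pack workflow with a throwaway package (all names are examples)
set -e
work=$(mktemp -d)

# 1. A tiny package standing in for a real dependency you would pack
mkdir -p "$work/foo"
cat > "$work/foo/package.json" <<'EOF'
{ "name": "foo", "version": "1.0.0" }
EOF

# 2. Pack it into a tarball, as you would for each dependency;
#    these .tgz files are what you commit to the packages repo
mkdir -p "$work/packages"
(cd "$work/packages" && npm pack "$work/foo" >/dev/null)   # writes foo-1.0.0.tgz

# 3. In the consuming project, install from the tarball path (no registry needed)
mkdir -p "$work/app"
(cd "$work/app" && npm init -y >/dev/null && \
 npm install --no-audit --no-fund "$work/packages/foo-1.0.0.tgz" >/dev/null)

cat "$work/app/package.json"   # dependency is recorded as a "file:" tarball path
```

Because the dependency is recorded as a `file:` spec, subsequent `npm install` runs on the offline machine resolve it from the cloned packages repo instead of the registry, provided the repo is cloned to the same relative or absolute path.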