I have code files from dozens of git repos in various subfolders under my c:\code folder, about 16 GB in total.
I want to migrate this folder to another computer. Copying the entire folder to a USB drive currently takes more than a day, because it consists of around 650,000 small files.
Is there some script I can run to clean up all of the repos in my c:\code folder?
Edit: all of the repos have a remote. I don't care about copying all branches. I only care about keeping the directory structure of the repos, e.g.
c:\code\github\NLog
c:\code\github\Swashbuckle.AspNetCore
c:\code\myclient\DevOpsProject1\solution1
c:\code\myclient\DevOpsProject1\solution2
c:\code\myclient\DevOpsProject2\solutionx
etc
CodePudding user response:
You can use git bundle
to pack each repository's full history into a single file (one bundle per repository).
You can then tar the dozens of bundles into one giant tar file.
Result: only one (big) file to copy, and to untar on the other side.
Finally, you can clone your repositories back from their respective bundle files.
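A minimal sketch of the bundling step, assuming Git Bash on Windows (the loop and the repo.bundle file name are illustrative, not part of the original answer):

cd /c/code
# create a repo.bundle inside each repository, containing every ref and its history
find . -name .git -type d -prune | while read gitdir; do
  repo=$(dirname "$gitdir")
  git -C "$repo" bundle create "$repo/repo.bundle" --all
done

On the target machine a bundle behaves like a read-only remote, so cloning back is just, for example:

git clone /path/to/repo.bundle NLog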
"I don't care about copying all branches. I only care about keeping the directory structure of the repos"

Then an alternative approach is to simply run tar cpvf code.tar code from under C:\.
Copy the giant tar file to the target machine, and extract it with tar xpvf code.tar: the directory structure will be preserved.
A bit like this per-repository variant, which archives each .git directory separately:
find . -name "*.git" -type d -prune -exec tar -czf {}.tar.z {} \; -exec rm -r {} \;
(Be careful with the -exec rm -r
part: it deletes each .git directory after archiving it, so test it out first).
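On the target machine, a hypothetical reverse step (not in the original answer) would extract each archive back in place and then delete it, assuming the whole tree of .tar.z files was copied over and this is run from the top of that tree:

# each archive stores its .git directory under its original relative path
find . -name "*.git.tar.z" -exec tar -xzf {} \; -exec rm {} \;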
CodePudding user response:
What I was originally looking for:
find . -name .git -type d -execdir git clean -dxf \;
This cleaned up my 16 GB of files across all repositories down to 800 MB: git clean -dxf deletes all untracked and ignored files (build output, packages, and the like) in each repo. Then I used @VonC's answer:
tar cpvf code.tar code
and then on the destination machine:
tar xpvf code.tar
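Worth noting: -x removes ignored files too, and nothing deleted this way is recoverable, so a dry run first is a sensible precaution. A sketch (-n makes git clean only print what it would delete):

find . -name .git -type d -execdir git clean -dxn \;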
CodePudding user response:
You can clone only the latest commit from your remote repository, like:
git clone --depth 1 https://url-of-your-repo
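To rebuild the whole directory structure this way, a sketch that records each repo's path and remote URL on the source machine, then shallow-clones them all on the target (assumes Git Bash, that every repo's remote is named origin, and that no repo path contains spaces; repos.txt is an illustrative file name):

# on the source machine, from C:\code
cd /c/code
find . -name .git -type d -prune | while read gitdir; do
  repo=$(dirname "$gitdir")
  echo "$repo $(git -C "$repo" remote get-url origin)"
done > repos.txt

# on the target machine, from the new code folder (with repos.txt copied over)
while read path url; do
  git clone --depth 1 "$url" "$path"
done < repos.txt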