Is it possible to download an entire repository folder with curl or wget? I can download files individually, e.g. curl -X GET -u user:pass http://IP:8081/repository/somefile.xx -O, but there are hundreds of files and they are updated every week.
CodePudding user response:
If http://10.20.42.52:8081/repository provides a listing of files, then you might harness the --recursive option of wget as follows:
wget --user=user --password=pass --recursive -np -nc http://10.20.42.52:8081/repository
Explanation: --recursive fetches the links present in the page at the given URL and downloads them, then fetches the links present in those downloads, downloads them, and so on; -np gets only files below the given URL in the hierarchy (for example, if http://10.20.42.52:8081/repository contains a link to http://10.20.42.52:8081, it will not be downloaded); -nc does not download a file that is already present locally. Read the wget man page if you want to know more about these options.
Disclaimer: the command is not tested, as I do not have access to http://10.20.42.52:8081/repository
CodePudding user response:
I solved it by writing a little shell script. It fetches the asset list as described on this page: https://help.sonatype.com/repomanager3/rest-and-integration-api/assets-api and then, with a for loop, automatically downloads all the RPMs. Thanks for your answer.
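For anyone landing here later, a script like the one described above could look roughly like this. It is a sketch, not the answerer's actual script: it assumes a Nexus 3 server, uses the /service/rest/v1/assets endpoint from the linked assets-API page, and requires jq for JSON parsing. The NEXUS, REPO, USER, and PASS values are placeholders you would replace with your own.

```shell
#!/usr/bin/env bash
# Sketch: download every asset in a Nexus 3 repository via the assets REST API.
# Assumptions: Nexus 3, jq installed; NEXUS/REPO/USER/PASS are placeholders.

NEXUS="http://10.20.42.52:8081"
REPO="repository-name"   # hypothetical repository name
USER="user"
PASS="pass"

# Build the paged assets-API URL; an empty token means the first page.
assets_url() {
  local token="$1"
  local url="$NEXUS/service/rest/v1/assets?repository=$REPO"
  [ -n "$token" ] && url="$url&continuationToken=$token"
  echo "$url"
}

# Walk the paginated asset list and fetch each downloadUrl.
download_all() {
  local token=""
  while :; do
    page=$(curl -s -u "$USER:$PASS" "$(assets_url "$token")")
    echo "$page" | jq -r '.items[].downloadUrl' | while read -r u; do
      curl -s -u "$USER:$PASS" -O "$u"
    done
    # continuationToken is null on the last page; stop then.
    token=$(echo "$page" | jq -r '.continuationToken // empty')
    [ -z "$token" ] && break
  done
}

# download_all   # uncomment to run against a live server
```

The continuation-token loop matters because the assets endpoint returns results in pages; a single request will not list hundreds of files.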