This is an interesting issue where I couldn't identify whether the cause is Curl, SCP, or Linux/Hadoop.
The current environment uses the following command to push a build to a Linux/Hadoop environment:
curl -k -v -T my-build.app scp://this.is.a.fake.url.com/linux/mount/drive/to/hadoop/
After providing the correct username and password, the build is pushed successfully.
However, when I check the content of the uploaded build, it is a file from a previous release (an old version that was uploaded before). It almost feels like there is a buffering mechanism, either in Curl or in Linux/Hadoop, that keeps the old build (which must be stored somewhere).
I also made an interesting observation: if I delete the existing build in Hadoop/Linux before running the curl command, the issue never occurs. So the problem only appears when using Curl to upload over an existing file; a fresh upload with no pre-existing file always succeeds. A sketch of that workaround is shown below.
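For reference, a minimal sketch of the delete-then-upload workaround, assuming SSH access to the same host; the host name, user, and paths are the illustrative ones from the question:

# Remove the existing file first (assumes SSH shell access to the host)
ssh user@this.is.a.fake.url.com 'rm -f /linux/mount/drive/to/hadoop/my-build.app'
# Then upload fresh; with a trailing slash, curl appends the local file name
curl -k -v -T my-build.app scp://this.is.a.fake.url.com/linux/mount/drive/to/hadoop/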
Just wondering if anyone has had a similar experience.
CodePudding user response:
Well, HDFS files are write-once: you don't modify a file in place, you either append to it or replace it (create a new file, delete the old one, rename the new one to the same file name). This is consistent with what you are seeing, and there is likely a lost error message in your mount tooling (since in-place modification is not possible, the write must be failing silently). The replace pattern is sketched below.
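For illustration, a minimal sketch of that replace pattern using the standard hdfs dfs CLI on the Hadoop side; the paths mirror the question and are assumptions, not your actual HDFS layout:

# Upload under a temporary name so the old file stays intact during the write
hdfs dfs -put my-build.app /linux/mount/drive/to/hadoop/my-build.app.new
# Delete the old file, then rename the new one to the original file name
hdfs dfs -rm /linux/mount/drive/to/hadoop/my-build.app
hdfs dfs -mv /linux/mount/drive/to/hadoop/my-build.app.new /linux/mount/drive/to/hadoop/my-build.app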