I have a situation where I pull a file from a folder via a Camel SFTP route. There are 3 other Camel routes that need to process the same file I am pulling; those routes monitor the same SFTP server.
Currently, we are thinking of making the file I pull available to the 3 other routes by placing a copy of it in 3 separate folders, one unique folder per route, where each folder is monitored by one of those other 3 routes.
When the other routes detect the presence of the copied file, they would fire up and proceed as required.
The first question is whether this is a reasonable way to handle the scenario I just described, or whether there is a better way.
Secondly, assuming the approach is reasonable, what is the best way to propagate the file I pull to 3 different folders on the same SFTP server?
For what it's worth, we are using Spring XML DSL.
CodePudding user response:
I think you should decouple the scanning of the files from the processing of the files. In particular, there is likely no real need to read the same file multiple times from the SFTP server; only its content needs to be processed multiple times.
This could end up looking something like:
```java
// Each folder-specific route hands the file off to a direct: endpoint,
// so the processing logic is decoupled from how the file was picked up.
// (The from("direct:processingN") routes holding the actual processing
// logic are not shown here.)
from("sftp://server/folder1")
    .to("direct:processing1");

from("sftp://server/folder2")
    .to("direct:processing2");

from("sftp://server/folder3")
    .to("direct:processing3");

// A single route reads the file once and fans its content out
// to all three processing endpoints in parallel.
from("sftp://server/toplevel")
    .multicast().parallelProcessing()
        .to("direct:processing1")
        .to("direct:processing2")
        .to("direct:processing3")
    .end();
```
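Since you mention you are using the Spring XML DSL, here is a rough sketch of the same multicast idea in XML. The endpoint URIs and route contents are placeholders, not your actual configuration; adjust host, credentials, and folder names to match your environment.

```xml
<!-- Sketch only: URIs are placeholders for your real SFTP endpoints. -->
<camelContext xmlns="http://camel.apache.org/schema/spring">

  <!-- Read the file once, then fan its content out in parallel. -->
  <route>
    <from uri="sftp://server/toplevel"/>
    <multicast parallelProcessing="true">
      <to uri="direct:processing1"/>
      <to uri="direct:processing2"/>
      <to uri="direct:processing3"/>
    </multicast>
  </route>

  <!-- One consumer route per direct: endpoint holds the actual logic. -->
  <route>
    <from uri="direct:processing1"/>
    <!-- processing steps for route 1 go here -->
  </route>
  <!-- routes for direct:processing2 and direct:processing3 follow
       the same pattern -->

</camelContext>
```

With this shape, none of the three processing routes ever touches the SFTP server themselves, so there is no need to copy the file into three monitored folders at all.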