I have a nested stream that I am using to process some data for a requirement:
final Map<String, Long> productModelQuantityReturnedMap = new HashMap<>();
sampleList.stream()
        .map(sampleModel -> {
            sampleModel.getEntries()
                    .stream()
                    .filter(entryModel -> Objects.nonNull(entryModel.getOrderEntry()))
                    .map(returnEntryModel -> {
                        sampleMap.put(key, val);
                        return null;
                    }).close();
            return null;
        }).close();
I am aware that a forEach or a plain for loop would be more idiomatic here, but I want to move this to a parallel stream going forward, since the data points don't depend on each other.
Although the documentation says the stream (or parallel stream) will wait until the inner part is done, execution jumps straight to the close() of the outer stream, and I don't understand why.
Can someone please help me find the issue here?
CodePudding user response:
map and filter are intermediate operations. A stream is not processed unless a terminal operation is invoked, and your code never invokes one, so the stream is never processed. Since close is not a terminal operation, it does not process the stream either: it just resets everything and ends the stream. You can refer to the source code of AbstractPipeline to see how close is implemented. The only thing it executes is whatever code, if any, was registered via the onClose method (i.e. if you had called onClose on the stream, as in stream().onClose(() -> myCloseAction())....).
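A minimal, self-contained sketch of this laziness (the flags are just illustrative instrumentation, not part of the original code): the map lambda never runs because no terminal operation is invoked, while close() runs only the registered onClose action.

```java
import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.stream.Stream;

public class LazyStreamDemo {
    static final AtomicBoolean MAP_RAN = new AtomicBoolean(false);
    static final AtomicBoolean CLOSE_RAN = new AtomicBoolean(false);

    public static void main(String[] args) {
        Stream<Integer> pipeline = List.of(1, 2, 3).stream()
                // intermediate operation: recorded, but not executed yet
                .map(n -> {
                    MAP_RAN.set(true);
                    return n * 2;
                })
                .onClose(() -> CLOSE_RAN.set(true));

        // no terminal operation ever runs; close() only fires the onClose action
        pipeline.close();

        System.out.println("map ran?   " + MAP_RAN.get());   // false
        System.out.println("close ran? " + CLOSE_RAN.get()); // true
    }
}
```

Swapping pipeline.close() for a terminal operation such as pipeline.count() or pipeline.forEach(...) is what would actually make the map lambda execute.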
Also, close is not really needed in the code you posted. As you mentioned in the question itself, you could use forEach, and that works with parallel streams as well. The only caveat is that your code updates a map from inside the pipeline, so make sure it is a concurrent map. Better still, use toMap or toConcurrentMap from the Collectors class instead of putting elements into a map from inside forEach.
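A sketch of that collector approach, using a hypothetical Entry record as a stand-in for the question's entry model (the real field names are unknown): the collector handles concurrency itself, so no shared HashMap or explicit put is needed even on a parallel stream.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class CollectDemo {
    // hypothetical stand-in for the question's entry model
    record Entry(String productCode, long quantity) {}

    static Map<String, Long> totals(List<Entry> entries) {
        return entries.parallelStream()
                .collect(Collectors.toConcurrentMap(
                        Entry::productCode,   // key function
                        Entry::quantity,      // value function
                        Long::sum));          // merge duplicate keys by summing
    }

    public static void main(String[] args) {
        List<Entry> entries = List.of(
                new Entry("A", 2), new Entry("B", 5), new Entry("A", 3));
        System.out.println(totals(entries)); // e.g. {A=5, B=5}
    }
}
```

collect is a terminal operation, so unlike the original close()-terminated pipeline, this one actually executes.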