multithreading - Why does Java Stream "reduce()" accumulate into the same object? -
```java
ComparisonResults comparisonResults = requestsList
        .parallelStream()
        .map(item -> getResponse(item))
        .map(item -> compareToBl(item))
        .reduce(new ComparisonResults(), (result1, result2) -> {
            result1.addSingleResult(result2);
            return result1;
        });
```
where:

```java
private ComparisonResults compareToBl(CompleteRoutingResponseShort completeRoutingResponseShortFresh) {
    ...
    ComparisonResults comparisonResults = new ComparisonResults();
    ...
    return comparisonResults;
}
```
However, when I debug:

```java
.reduce(new ComparisonResults(), (result1, result2) -> {
    result1.addSingleResult(result2);
    return result1;
});
```
I see that result1 and result2 are the same object (same object ID in IDEA), i.e. result1 equals result2, even though addSingleResult should return a new object, a modified copy of this.
Should I change the code to:

```java
.reduce(new ComparisonResults(), (result1, result2) -> {
    return result1.addSingleResult(result2);
});
```

Otherwise, am I returning the same instance (without modifications)?
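For context, here is a minimal, self-contained sketch (not from the question; the names are hypothetical, with a plain List standing in for ComparisonResults) of why a mutable identity misbehaves in a parallel reduce: the same identity instance is handed to every partition as its starting value, so when the accumulator mutates it and returns it, every partial result, and the final result, is that one shared object. At the combine step both arguments alias the same instance, which matches the "result1 equals result2" observation in the debugger.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.stream.IntStream;

public class SharedIdentityDemo {
    public static void main(String[] args) {
        // synchronizedList only prevents the parallel partitions from
        // corrupting the list while they race on it; the aliasing
        // problem demonstrated here remains either way.
        List<Integer> identity = Collections.synchronizedList(new ArrayList<>());

        List<Integer> result = IntStream.range(0, 4)
                .parallel()
                .mapToObj(List::of)              // Stream<List<Integer>>
                .reduce(identity, (a, b) -> {    // mutates the shared identity
                    a.addAll(b);
                    return a;
                });

        // Every partial result is the very same identity instance,
        // so the final result aliases it too.
        System.out.println(result == identity);  // prints "true"
    }
}
```

Because the combiner can receive the same object as both arguments, the accumulated contents are unpredictable; only the aliasing itself is deterministic.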
From the Java documentation:

The reduce operation always returns a new value. However, the accumulator function also returns a new value every time it processes an element of a stream. Suppose that you want to reduce the elements of a stream to a more complex object, such as a collection. This might hinder the performance of your application. If your reduce operation involves adding elements to a collection, then every time your accumulator function processes an element, it creates a new collection that includes the element, which is inefficient. It would be more efficient for you to update an existing collection instead. You can do this with the Stream.collect method, which the next section describes.
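Following that advice, a collect-based mutable reduction could look like the sketch below (again with a plain List standing in for ComparisonResults): collect gives each parallel partition its own fresh container via the supplier, so no shared object is ever mutated and no new container is allocated per element.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.IntStream;

public class CollectDemo {
    public static void main(String[] args) {
        // supplier:    a NEW container per partition (no shared identity)
        // accumulator: folds one element into that partition's container
        // combiner:    merges two partition containers
        List<Integer> result = IntStream.range(0, 4)
                .parallel()
                .boxed()
                .collect(ArrayList::new, ArrayList::add, ArrayList::addAll);

        System.out.println(result);  // prints "[0, 1, 2, 3]"
    }
}
```

If ComparisonResults exposed a no-argument constructor, a per-element accumulate method, and a merge method, the same three-argument collect shape would apply to the question's pipeline, but that is an assumption about its API.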