I think it's logically monotonic in the sense that as you add input data to your query, the output simply accumulates (you never have to retract any prior conclusions).
Consistency between two accumulators is easy: they just sum their states.
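A minimal sketch of that idea (the `SumAccumulator` name is my own, for illustration): a sum accumulator whose merge is literally adding the two states together. Because addition is commutative and associative, the merged result doesn't depend on which replica saw which inputs.

```python
class SumAccumulator:
    """Accumulator whose entire state is a running total."""

    def __init__(self, total=0):
        self.total = total

    def add(self, x):
        self.total += x

    def merge(self, other):
        # Consistency between two accumulators: just sum their states.
        return SumAccumulator(self.total + other.total)


a = SumAccumulator()
for x in [1, 2, 3]:
    a.add(x)

b = SumAccumulator()
for x in [10, 20]:
    b.add(x)

print(a.merge(b).total)  # 36
```

Note one caveat: this merge is not idempotent (merging the same state twice double-counts), so it assumes each worker's state is combined exactly once.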
Any non-monotonic computation can be split into a monotonic phase, which processes the bulk of the data, followed by a non-monotonic fold over the partial results that produces the final answer.
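As a concrete (hypothetical) instance of that split: computing an average. The average itself is non-monotonic, since new data can move it in either direction, but it factors into a monotonic accumulation of `(sum, count)` pairs plus a single non-monotonic fold (the division) at the end.

```python
def accumulate(state, x):
    s, n = state
    return (s + x, n + 1)  # monotonic: both components only grow

def merge(a, b):
    # Commutative and associative, so workers can merge in any order.
    return (a[0] + b[0], a[1] + b[1])

def fold(state):
    # The non-monotonic final step: a single division over the
    # merged monotonic state.
    s, n = state
    return s / n if n else None


# Two workers process disjoint shards of the data:
w1 = (0, 0)
for x in [1.0, 2.0, 3.0]:
    w1 = accumulate(w1, x)

w2 = (0, 0)
for x in [4.0, 5.0]:
    w2 = accumulate(w2, x)

print(fold(merge(w1, w2)))  # 3.0
```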
Min is also logically monotonic in the same sense though, right? As a min accumulator I can throw away all past information once I've computed my result and never have consistency issues in light of new data.
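To illustrate that point (again with a hypothetical `MinAccumulator` class): a min accumulator can discard every past input once the running minimum is known, and its merge is idempotent (`min(a, a) == a`), so re-merging or reordering never creates a consistency issue.

```python
import math


class MinAccumulator:
    """Accumulator that keeps only the running minimum."""

    def __init__(self, value=math.inf):
        self.value = value

    def observe(self, x):
        # Past inputs can be thrown away; only the minimum survives.
        self.value = min(self.value, x)

    def merge(self, other):
        # Idempotent, commutative, associative: safe to merge repeatedly.
        return MinAccumulator(min(self.value, other.value))


a = MinAccumulator()
for x in [5, 3, 8]:
    a.observe(x)

b = MinAccumulator()
for x in [7, 4]:
    b.observe(x)

print(a.merge(b).value)  # 3
```

Unlike the sum accumulator, merging a state with itself here is harmless, which is what makes min safe even under duplicated or re-delivered merges.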
A better example would be reachability analysis in a DAG: there you really do get a monotonicity failure when merging results from multiple workers.