So my friend and I are trying to perform a map reduce on a collection which has items constantly being added to it.

Essentially we calculate the average of some fields and insert the results into a collection (via map reduce).

Here's the problem: every time map reduce is run, it goes through ALL of the documents. I'm new to map reduce, but based on what I know, it seems it would be much more efficient if it only ran map reduce on the new and/or modified documents and merged those results with the existing collection.

So I was like, OK, I'll just do it myself. I added a "processed: false" flag to the collection; when the map reduce runs I pass in a query filter `{processed: false}`, then after the map reduce finishes I set `processed: true` on all the items where processed = false.

Here's the problem: I'm worried about an edge case. What happens if, during the map reduce, some items are added to the collection? They were never passed into the map reduce, and yet after the map reduce runs their processed flag is set to true.
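That race can be shown with a minimal in-memory sketch (plain JavaScript standing in for the mongo shell; the collection contents, field names, and the snapshot standing in for the map reduce are all illustrative):

```javascript
// Illustrative in-memory stand-in for a MongoDB collection.
const col = [
  { _id: 1, value: 10, processed: false },
  { _id: 2, value: 20, processed: false },
];

// The "map reduce": a snapshot of the docs matching {processed: false}.
const batch = col.filter((d) => !d.processed);

// While the (slow) map reduce is running, a new item arrives...
col.push({ _id: 3, value: 30, processed: false });

// ...then we blindly flip every {processed: false} doc to true.
col.forEach((d) => { if (!d.processed) d.processed = true; });

// Doc 3 was never in the batch, yet it is now marked processed,
// so no future run will ever pick it up.
const inBatch = batch.some((d) => d._id === 3);          // false
const flagged = col.find((d) => d._id === 3).processed;  // true
console.log(inBatch, flagged); // false true
```

Doc 3 is silently skipped forever, which is exactly the edge case described above.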

What would be nice is if, instead of passing a "query filter" into mongo, I could pass in a query object "set", so that I can set the processed flag to true first and then pass in exactly those objects.

Make it a 3-step thing. Have 3 states, say RAW, MARKEDFORPROCESSING and PROCESSED, then:

  1. db.col.update({state: "RAW"}, {$set: {state: "MARKEDFORPROCESSING"}}, false, true)
  2. Run m/r against the MARKEDFORPROCESSING documents. They are guaranteed to have been there at m/r start.
  3. db.col.update({state: "MARKEDFORPROCESSING"}, {$set: {state: "PROCESSED"}}, false, true)
  4. Go to 1.
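One cycle of the steps above can be sketched in plain JavaScript (an in-memory array standing in for db.col; the `state` field name and values are the assumed states, and the filter stands in for the m/r run):

```javascript
// In-memory stand-in for db.col, all docs starting in the RAW state.
const col = [
  { _id: 1, state: "RAW" },
  { _id: 2, state: "RAW" },
];

// Step 1: atomically claim everything that is currently RAW.
col.forEach((d) => { if (d.state === "RAW") d.state = "MARKEDFORPROCESSING"; });

// Step 2: a new doc arriving mid-run simply enters as RAW;
// the m/r sees only the docs claimed in step 1.
col.push({ _id: 3, state: "RAW" });
const batch = col.filter((d) => d.state === "MARKEDFORPROCESSING");

// Step 3: only the claimed docs are advanced to PROCESSED.
col.forEach((d) => { if (d.state === "MARKEDFORPROCESSING") d.state = "PROCESSED"; });

// Doc 3 is still RAW and will be claimed on the next cycle (step 4).
console.log(col.map((d) => d.state)); // [ 'PROCESSED', 'PROCESSED', 'RAW' ]
```

Because a document can only be advanced out of MARKEDFORPROCESSING, nothing inserted during the m/r run can ever be marked processed without having been in the batch.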

This removes your edge case and, given MongoDB's atomic updates, is completely safe.