I receive records from a server at 10-minute intervals (so 6 files per hour). I want to run a map/reduce over each hour's group of 6 files, and in the next hour do the same for the next group of 6 files together with the previous hour's files. How can I solve this problem? I have been stuck on it for a month. Thank you, Sushil Kr Singh
Well, read about map/reduce: http://www.mongodb.org/display/DOCS/MapReduce and then call that MR from a scheduled job on Linux, Mac, or Windows, running a script of your choice that fires off the MR. As a tip, you will need to keep a counter collection that records when the MR was last run, so you can fetch all records in the date range from then until now. Without further info and a better-written question, that's all I've really got. – Sammaye Sep 03 '12 at 19:50
1 Answer
In order to summarize your 10-minute log files by the hour, you could round down the timestamp of each logfile to the nearest hour in the map function and group the results by hours in the reduce function.
Here is a little dummy example that illustrates this from the mongo shell:
Create 100 log files, each 10 minutes apart and containing a random number between 0 and 10, and insert them into the `logs` collection in the database:

```
for (var i = 0; i < 100; i++) {
    d = new ISODate();
    d.setMinutes(d.getMinutes() + i * 10);
    r = Math.floor(Math.random() * 11);
    db.logs.insert({timestamp: d, number: r});
}
```
To check what the `logs` collection looks like, send a query like `db.logs.find().limit(3).pretty()`, which results in:

```
{
    "_id" : ObjectId("50455a3570537f9433c1efb2"),
    "timestamp" : ISODate("2012-09-04T01:32:37.370Z"),
    "number" : 2
}
{
    "_id" : ObjectId("50455a3570537f9433c1efb3"),
    "timestamp" : ISODate("2012-09-04T01:42:37.370Z"),
    "number" : 3
}
{
    "_id" : ObjectId("50455a3570537f9433c1efb4"),
    "timestamp" : ISODate("2012-09-04T01:52:37.370Z"),
    "number" : 8
}
```
Define a map function (in this example called `mapf`) that rounds each timestamp down to the nearest hour, which is used as the emit key. The emit value is the number from that log file.

```
mapf = function () {
    // round down to nearest hour
    d = this.timestamp;
    d.setMinutes(0);
    d.setSeconds(0);
    d.setMilliseconds(0);
    emit(d, this.number);
}
```
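The rounding step can be checked outside the mongo shell as well; this is a plain-JavaScript sketch of the same logic, with `Date` standing in for the shell's `ISODate`:

```javascript
// Plain-JavaScript version of the rounding done in mapf above.
function roundToHour(ts) {
  var d = new Date(ts.getTime()); // copy, so the input is not mutated
  d.setMinutes(0);
  d.setSeconds(0);
  d.setMilliseconds(0);
  return d;
}

// A log written at 12:42:37.370 falls into the 12:00 bucket:
var bucket = roundToHour(new Date(2012, 8, 4, 12, 42, 37, 370));
```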
Define a reduce function that sums over all the emitted values (i.e. the numbers).

```
reducef = function (key, values) {
    var sum = 0;
    for (var v in values) {
        sum += values[v];
    }
    return sum;
}
```
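One thing to be aware of: map/reduce may call the reduce function several times on partial results for the same key, so its output must be usable as its input. For a plain sum this holds automatically, as a quick plain-JavaScript check illustrates:

```javascript
// Same reduce logic as reducef above, runnable outside the shell.
function reducef(key, values) {
  var sum = 0;
  for (var v in values) {
    sum += values[v];
  }
  return sum;
}

// Reducing all values at once equals reducing partial sums of the
// same values (the "re-reduce" property MongoDB relies on):
var all = reducef(null, [2, 3, 8, 5]);
var staged = reducef(null, [reducef(null, [2, 3]), reducef(null, [8, 5])]);
```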
Now execute map/reduce on the `logs` collection. The `out` parameter here specifies that we want to write the results to the `hourly_logs` collection and merge existing documents with new results. This ensures that log files submitted later (e.g. after a server failure or other delay) will be included in the results once they appear in `logs`.

```
db.logs.mapReduce(mapf, reducef, {out: { merge : "hourly_logs" }})
```
Lastly, to see the results, run a simple find on `hourly_logs`:

```
db.hourly_logs.find()
{ "_id" : ISODate("2012-09-04T02:00:00Z"), "value" : 33 }
{ "_id" : ISODate("2012-09-04T03:00:00Z"), "value" : 31 }
{ "_id" : ISODate("2012-09-04T04:00:00Z"), "value" : 21 }
{ "_id" : ISODate("2012-09-04T05:00:00Z"), "value" : 40 }
{ "_id" : ISODate("2012-09-04T06:00:00Z"), "value" : 26 }
{ "_id" : ISODate("2012-09-04T07:00:00Z"), "value" : 26 }
{ "_id" : ISODate("2012-09-04T08:00:00Z"), "value" : 25 }
{ "_id" : ISODate("2012-09-04T09:00:00Z"), "value" : 46 }
{ "_id" : ISODate("2012-09-04T10:00:00Z"), "value" : 27 }
{ "_id" : ISODate("2012-09-04T11:00:00Z"), "value" : 42 }
{ "_id" : ISODate("2012-09-04T12:00:00Z"), "value" : 43 }
{ "_id" : ISODate("2012-09-04T13:00:00Z"), "value" : 35 }
{ "_id" : ISODate("2012-09-04T14:00:00Z"), "value" : 22 }
{ "_id" : ISODate("2012-09-04T15:00:00Z"), "value" : 34 }
{ "_id" : ISODate("2012-09-04T16:00:00Z"), "value" : 18 }
{ "_id" : ISODate("2012-09-04T01:00:00Z"), "value" : 13 }
{ "_id" : ISODate("2012-09-04T17:00:00Z"), "value" : 25 }
{ "_id" : ISODate("2012-09-04T18:00:00Z"), "value" : 7 }
```
The result is an hourly summary of your 10-minute logs, with the `_id` field containing the start of the hour and the `value` field the sum of the random numbers. In your case you may need different aggregation operators; modify the reduce function according to your needs.
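For instance, if you wanted an hourly average instead of a sum, the reduce function has to remain re-reduce-safe, so it should carry a sum and a count and leave the division to a finalize function. A minimal sketch in plain JavaScript (the map function would then emit `{sum: this.number, count: 1}` instead of a bare number):

```javascript
// Reduce partial results of shape {sum, count}. The output has the
// same shape as the inputs, so MongoDB may safely re-reduce it.
function reduceAvg(key, values) {
  var out = { sum: 0, count: 0 };
  for (var i = 0; i < values.length; i++) {
    out.sum += values[i].sum;
    out.count += values[i].count;
  }
  return out;
}

// Finalize runs exactly once per key, so it may change the shape.
function finalizeAvg(key, value) {
  return value.count > 0 ? value.sum / value.count : 0;
}

// Average of the numbers 2, 3, and 8 emitted for one hour:
var avg = finalizeAvg(null, reduceAvg(null, [
  { sum: 2, count: 1 },
  { sum: 3, count: 1 },
  { sum: 8, count: 1 }
]));
```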
As Sammaye mentioned in the comment, you could automate the map/reduce call with a cron job entry to run every hour.
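For example, a crontab entry along these lines would run a mongo shell script shortly after every hour; the database name and script path are placeholders, and the script would contain the `db.logs.mapReduce(...)` call shown above:

```shell
# Hypothetical crontab entry: at five minutes past each hour, run a
# mongo shell script containing the mapReduce call (paths are examples).
5 * * * * /usr/bin/mongo mydb /path/to/hourly_mr.js
```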
If you don't want to process the full logs collection every time, you can run incremental updates by limiting the documents to hourly time windows like so:
```
var q = { $and: [ {timestamp: {$gte: new Date(2012, 8, 4, 12, 0, 0) }},
                  {timestamp: {$lt:  new Date(2012, 8, 4, 13, 0, 0) }} ] }
db.logs.mapReduce(mapf, reducef, {query: q, out: { merge : "hourly_logs" }})
```
This would only include log files between the hours of 12 and 13. Note that the month value in the `Date()` object starts at 0 (8 = September). Because of the `merge` option, it is safe to run the map/reduce on already-processed log files.
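When this runs from a scheduled script, the window boundaries do not need to be hard-coded; they can be derived from the current time. A plain-JavaScript sketch that computes the last completed hour (the resulting start/end would go into the `$gte`/`$lt` query above):

```javascript
// Boundaries of the last completed hour, for a job that runs shortly
// after each hour and processes the hour that just ended.
function previousHourWindow(now) {
  var end = new Date(now.getTime());
  end.setMinutes(0);
  end.setSeconds(0);
  end.setMilliseconds(0);
  var start = new Date(end.getTime() - 60 * 60 * 1000);
  return { start: start, end: end };
}

// At 13:07, the window covers 12:00 (inclusive) to 13:00 (exclusive):
var w = previousHourWindow(new Date(2012, 8, 4, 13, 7, 0));
```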
