aggregation-framework

Find last record of each day

Submitted by 孤街醉人 on 2019-12-29 01:19:16
Question: I use MongoDB (I'm new to MongoDB) to store data about my power consumption; each minute there is a new record. Here is an example: {"id":"5309d4cae4b0fbd904cc00e1","adco":"O","hchc":7267599,"hchp":10805900,"hhphc":"g","ptec":"c","iinst":13,"papp":3010,"imax":58,"optarif":"s","isousc":60,"motdetat":"Á","date":1393156826114} So I have around 1440 records a day. I want to compute the cost per day, but the problem is that I need the last record of the day, because this record can give
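
The counters in these records (hchc, hchp) are cumulative, so the last reading of each day is what matters. Below is a minimal sketch of one way to get it, assuming MongoDB 3.0+ for $dateToString and a hypothetical collection name consumption; since "date" is stored as epoch milliseconds, it is first turned into a BSON Date by adding it to new Date(0).

    db.consumption.aggregate([
      { $project: {
          hchc: 1,
          hchp: 1,
          ts: { $add: [ new Date(0), "$date" ] }              // epoch millis -> BSON Date
      } },
      { $project: {
          hchc: 1,
          hchp: 1,
          ts: 1,
          day: { $dateToString: { format: "%Y-%m-%d", date: "$ts" } }
      } },
      { $sort: { ts: 1 } },                                    // chronological order within each day
      { $group: {
          _id: "$day",
          lastHchc: { $last: "$hchc" },                        // last cumulative counter of the day
          lastHchp: { $last: "$hchp" }
      } },
      { $sort: { _id: 1 } }
    ])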

Mongodb concat int and string

Submitted by 我与影子孤独终老i on 2019-12-28 16:37:37
Question: I'm trying to project FileName and FileSize for all the files in my collection with a size of 50 MB and greater, but I cannot concat the FileSize field as it has type Int. I want the projection to be { "result" : [ { "_id" : ObjectId("5652c399a21dad0bb01b6308"), "FileName" : "1234567890.xml", "FileSize" : "11.06 MB" }, { "_id" : ObjectId("5652c399a21dad0bb01b630f"), "FileName" : "2468101214.xml", "FileSize" : "320.48 MB" }, { "_id" : ObjectId("5652c399a21dad0bb01b631f"), "FileName" :
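
One way to build the human-readable size string entirely in the pipeline is sketched below; it assumes MongoDB 4.0+ for $toString (on older servers the formatting would have to happen in application code), and the collection name files is hypothetical.

    db.files.aggregate([
      { $match: { FileSize: { $gte: 50 * 1024 * 1024 } } },     // 50 MB and greater (bytes)
      { $project: {
          FileName: 1,
          FileSize: {
            $concat: [
              { $toString: {
                  // bytes -> MB, truncated to two decimal places
                  $divide: [
                    { $trunc: { $multiply: [ { $divide: [ "$FileSize", 1048576 ] }, 100 ] } },
                    100
                  ]
              } },
              " MB"
            ]
          }
      } }
    ])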

Group by Date with Local Time Zone in MongoDB

Submitted by 有些话、适合烂在心里 on 2019-12-28 04:26:06
Question: I am new to MongoDB. Below is my query. Model.aggregate() .match({ 'activationId': activationId, "t": { "$gte": new Date(fromTime), "$lt": new Date(toTime) } }) .group({ '_id': { 'date': { $dateToString: { format: "%Y-%m-%d %H", date: "$datefield" } } }, uniqueCount: { $addToSet: "$mac" } }) .project({ "date": 1, "month": 1, "hour": 1, uniqueMacCount: { $size: "$uniqueCount" } }) .exec() .then(function (docs) { return docs; }); The issue is that MongoDB stores dates in UTC. I need this
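
A common way to handle this, sketched below, is to let the server do the time-zone conversion: $dateToString accepts a timezone option on MongoDB 3.6+ (on older servers the usual workaround is adding the UTC offset in milliseconds to the date before formatting). The field "$t" and the zone "Asia/Kolkata" are illustrative assumptions.

    Model.aggregate()
      .match({
        activationId: activationId,
        t: { $gte: new Date(fromTime), $lt: new Date(toTime) }
      })
      .group({
        _id: { date: { $dateToString: {
          format: "%Y-%m-%d %H",
          date: "$t",
          timezone: "Asia/Kolkata"          // hypothetical local time zone
        } } },
        uniqueMacs: { $addToSet: "$mac" }
      })
      .project({ uniqueMacCount: { $size: "$uniqueMacs" } })
      .exec()
      .then(function (docs) { return docs; });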

Mongodb aggregation pipeline how to limit a group push

Submitted by 故事扮演 on 2019-12-28 03:05:50
Question: I am not able to limit the number of pushed elements in a $group stage of the aggregation pipeline. Is this possible? Small example: Data: [ { "submitted": date, "loc": { "lng": 13.739251, "lat": 51.049893 }, "name": "first", "preview": "my first" }, { "submitted": date, "loc": { "lng": 13.639241, "lat": 51.149883 }, "name": "second", "preview": "my second" }, { "submitted": date, "loc": { "lng": 13.715422, "lat": 51.056384 }, "name": "nearpoint2", "preview": "my nearpoint2" } ] Here is my
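
A sketch of one approach follows: push everything in $group and then trim the array with $slice in a later $project, which requires MongoDB 3.2+; the collection name points, the grouping key, and the limit of 3 are placeholders.

    db.points.aggregate([
      { $sort: { submitted: -1 } },                      // newest first
      { $group: {
          _id: null,                                     // replace with a real grouping key
          items: { $push: { name: "$name", preview: "$preview", loc: "$loc" } }
      } },
      { $project: { items: { $slice: [ "$items", 3 ] } } }   // keep at most 3 per group
    ])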

using $slice operator to get last element of array in mongodb

Submitted by 只愿长相守 on 2019-12-28 02:07:08
Question: How do I get the last element of an array with conditions in MongoDB? I am unable to use $slice. Here is my code: { "1" : { "relevancy" : [ "Y" ] }, "_id" : ObjectId("530824b95f44eac1068b45c0") } { "1" : { "relevancy" : [ "Y", "Y" ] }, "_id" : ObjectId("530824b95f44eac1068b45c2") } { "1" : { "relevancy" : [ "N" ] }, "_id" : ObjectId("530824b95f44eac1068b45c3") } { "1" : { "relevancy" : [ "Y", "Y" ] }, "_id" : ObjectId("530824b95f44eac1068b45c4") } { "1" : { "relevancy" : [ "Y", "N" ] }, "_id" : ObjectId
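
On MongoDB 3.2+ the aggregation operator $arrayElemAt with index -1 returns the last element, which side-steps $slice entirely. A minimal sketch, with the field path taken from the sample documents and the match condition as an example:

    db.collection.aggregate([
      { $project: {
          lastRelevancy: { $arrayElemAt: [ "$1.relevancy", -1 ] }   // last element of the array
      } },
      { $match: { lastRelevancy: "Y" } }                            // example condition on that element
    ])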

exception: can't convert from BSON type EOO to Date

Submitted by a 夏天 on 2019-12-27 17:39:29
Question: I am getting an error when running the following aggregate query: db.snippets.aggregate([ { '$project': { month: { '$month': '$created_at' }} } ]) The error message is: assert: command failed: { "errmsg" : "exception: can't convert from BSON type EOO to Date", "code" : 16006, "ok" : 0 } : aggregate failed How do I get around this issue? I found a related question (Related Stack Overflow question), but it doesn't tell how to get things done. Answer 1: You likely have one or more docs with a
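
The answer's hint is that some documents are missing created_at (or hold a non-date value), so $month has nothing to work on. A sketch of the usual guard:

    db.snippets.aggregate([
      // keep only documents whose created_at exists and is a BSON Date (type 9)
      { $match: { created_at: { $exists: true, $type: 9 } } },
      { $project: { month: { '$month': '$created_at' } } }
    ])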

Get a count of total documents with MongoDB when using limit

Submitted by 99封情书 on 2019-12-27 11:04:58
Question: I am interested in optimizing a "pagination" solution I'm working on with MongoDB. My problem is straightforward. I usually limit the number of documents returned using the limit() functionality. This forces me to issue a redundant query without the limit() function in order to also capture the total number of documents in the query, so I can pass that to the client, letting them know they'll have to issue additional request(s) to retrieve the rest of the documents. Is there a way
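
On MongoDB 3.4+ a single $facet stage can return both the page and the total count in one round trip; a sketch, with the collection name, query, and page size as placeholders:

    db.items.aggregate([
      { $match: { /* the pagination query */ } },
      { $facet: {
          total: [ { $count: "count" } ],                                  // total matching documents
          page:  [ { $sort: { _id: 1 } }, { $skip: 0 }, { $limit: 20 } ]   // one page of results
      } }
    ])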

Mongodb Aggregation command to java code

Submitted by 走远了吗. on 2019-12-25 18:35:44
Question: Convert a shell command to Java code. What I am trying to do is GROUP the collection by "sourceSystemName" and get the values of logID, type, _id, sourceSystemName, logTime for the MAX "logTime". Collection sample data (contains 1 million documents): { "logID" : "1487408645950", "logTime" : ISODate("2017-02-6T06:47:59Z"), "type" : "SYSTEM_MONITOR", "sourceSystemId" :"192.168.1.226", "sourceSystemName" : "LOADER.LOG" } { "logID" : "1488226732268", "logTime" : ISODate("2017-02-16T06:48:00Z"),"type" : "SYSTEM_MONITOR"
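
In shell form, the pipeline to be translated boils down to a $sort followed by a $group that keeps the first (i.e. latest) document per sourceSystemName; in the Java driver this maps onto the Aggregates.sort / Aggregates.group helpers with Accumulators.first. A sketch of the shell pipeline, with the collection name logs assumed:

    db.logs.aggregate([
      { $sort: { logTime: -1 } },                 // newest log first
      { $group: {
          _id: "$sourceSystemName",
          logID:   { $first: "$logID" },
          type:    { $first: "$type" },
          docId:   { $first: "$_id" },
          logTime: { $first: "$logTime" }         // this is the MAX logTime per group
      } }
    ])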

Got duplicated data when subscribe multiple times

Submitted by 爱⌒轻易说出口 on 2019-12-25 17:04:35
Question: I am using MongoDB aggregation in Meteor. I get duplicated data when I subscribe multiple times. (The data in the database are static, which means they are the same all the time.) // Server side Meteor.publish('totalNumber', function () { let pipeline = [ { $unwind: '$product' }, { $group: { _id: { code: '$product.code', hour: { $hour: '$timestamp' } }, total: { $sum: '$product.count' }, }} ]; Products.aggregate( pipeline, Meteor.bindEnvironment((err, result) => { console.log('result', result); // at
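
One likely cause (a guess, since the excerpt is cut off) is that each aggregation result row is published with a fresh random _id, so every new subscription adds another copy on the client. A sketch that derives a stable id from the group key instead; the client-side collection name 'totals' is hypothetical:

    Meteor.publish('totalNumber', function () {
      var self = this;
      var pipeline = [
        { $unwind: '$product' },
        { $group: {
            _id: { code: '$product.code', hour: { $hour: '$timestamp' } },
            total: { $sum: '$product.count' }
        } }
      ];
      Products.aggregate(pipeline, Meteor.bindEnvironment(function (err, result) {
        if (err) { throw new Meteor.Error('aggregation-failed', err.message); }
        result.forEach(function (row) {
          // deterministic id: the same row always maps to the same published document
          self.added('totals', row._id.code + '-' + row._id.hour, { total: row.total });
        });
        self.ready();
      }));
    });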
