mongodb-query

MongoDB: how to compare $size of array to another document item?

Submitted by 二次信任 on 2019-12-22 10:20:05
Question: MongoDB: how can I do something like this, both in the mongo console and via JavaScript / Node.js:

    db.turnys.find( { users: { $size: seats } } )

turnydb.turnys looks like this:

    [ { "gId": "5335e4a7b8cf51bcd054b423", "seats": 2, "start": "2014-03-30T14:23:29.688Z", "end": "2014-03-30T14:25:29.688Z", "rMin": 800, "rMax": 900, "users": [], "_id": "533828e1d1a483b7fd8a707a" },
      { "gId": "5335e4a7b8cf51bcd054b423", "seats": 2, "start": "2014-03-30T14:23:29.688Z", "end": "2014-03-30T14:25:29.688Z", "rMin": 900 …
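A $size query only accepts a literal number, not a reference to another field, so comparing users.length to seats needs either an expression match or server-side JavaScript. A minimal sketch for the mongo shell, assuming MongoDB 3.6+ for $expr (collection and field names taken from the question):

```javascript
// Match documents where the users array has exactly `seats` elements.
// $expr (MongoDB 3.6+) lets a find() compare two fields of the same document.
db.turnys.find({
  $expr: { $eq: [ { $size: "$users" }, "$seats" ] }
})

// Older servers can fall back to server-side JavaScript, which is slower:
db.turnys.find({ $where: "this.users.length === this.seats" })
```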

Updating a deep record in MongoDb

Submitted by 杀马特。学长 韩版系。学妹 on 2019-12-22 08:25:48
Question: I have this record in MongoDB, and am using the native API:

    { "_children" : { "addressesR" : [
        { "id" : ObjectId("530eea01071bd1a53065c1a6"), "personId" : ObjectId("530eea01071bd1a53065c1a4"), "street" : "ivermey", "city" : "perth", "_children" : { } },
        { "_children" : { "configId" : { "a" : { "_children" : [ { "b" : 10 }, { "b" : 20 } ] } } }, "city" : "perth", "configId" : ObjectId("530eea01071bd1a53065c1a3"), "id" : ObjectId("530eea01071bd1a53065c1a5"), "personId" : ObjectId( …
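The question is cut off here, but updating a single element of a nested array with the Node.js native driver generally goes through the positional operators. A minimal sketch under that assumption; the collection name, the new city value, and someDocumentId are placeholders, only the field paths come from the document above:

```javascript
const { ObjectId } = require('mongodb');

// Set the city of the address whose id matches, using the positional "$" operator.
db.collection('people').updateOne(
  { '_children.addressesR.id': new ObjectId('530eea01071bd1a53065c1a6') },
  { $set: { '_children.addressesR.$.city': 'sydney' } }
);

// MongoDB 3.6+ can target the element explicitly with arrayFilters instead:
db.collection('people').updateOne(
  { _id: someDocumentId },   // hypothetical _id of the parent document
  { $set: { '_children.addressesR.$[addr].city': 'sydney' } },
  { arrayFilters: [ { 'addr.id': new ObjectId('530eea01071bd1a53065c1a6') } ] }
);
```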

Update field type in mongo

Submitted by 此生再无相见时 on 2019-12-22 08:12:41
Question: I have a huge number of records in a collection of the form {field: [value]}. How can I efficiently update them to {field: value}? I've tried something like this (pymongo syntax):

    collection.update({"field.1": {"$exists": True}}, {"$set": {'field': "field.1"}}, multi=True)

which apparently does not work. Running through each record in a loop and removing/re-inserting is not an option because of the large number of records.

Answer 1: You need to loop over the cursor and update each document using the $set update …
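The answer above is truncated; a minimal sketch of the cursor-loop approach in the mongo shell is below. Note that array indexes are 0-based, so the first element is field.0, and a plain $set cannot copy one field's value into another, which is why the value is computed on the client. Collection name is a placeholder:

```javascript
// Unwrap {field: [value]} into {field: value}, one document at a time.
db.getCollection('coll_name').find({ 'field.0': { $exists: true } }).forEach(function (doc) {
  db.getCollection('coll_name').updateOne(
    { _id: doc._id },
    { $set: { field: doc.field[0] } }
  );
});

// On MongoDB 4.2+ the same rewrite can run server-side in a single command,
// using an aggregation-pipeline update (assumes each array holds one element):
db.getCollection('coll_name').updateMany(
  { field: { $type: 'array' } },
  [ { $set: { field: { $arrayElemAt: ['$field', 0] } } } ]
);
```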

Mongodb aggregation pipeline size and speed issue

Submitted by 淺唱寂寞╮ on 2019-12-22 08:10:05
Question: I'm trying to use a MongoDB aggregation query to join ($lookup) two collections and then distinct-count all the unique values in the joined array. Note: I don't necessarily know which fields (keys) are in the metaDataMap array, and I don't want to count or include fields that may or may not exist in the map; that's why the aggregation query looks the way it does. My two collections look like this:

    events:  { "_id" : "1", "name" : "event1", "objectsIds" : [ "1", "2", "3" ] }
    Objects: { " …
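The rest of the question (including the original pipeline) is cut off, but the general shape of a $lookup join followed by a distinct count looks roughly like the sketch below. Collection and field names follow the fragment above; 'joined.someField' is a placeholder, since the real query would target the unknown keys of metaDataMap:

```javascript
db.events.aggregate([
  // Join each event to its referenced objects.
  { $lookup: {
      from: 'Objects',
      localField: 'objectsIds',
      foreignField: '_id',
      as: 'joined'
  } },
  { $unwind: '$joined' },
  // Collect the distinct values of some joined field, then count them.
  { $group: { _id: null, values: { $addToSet: '$joined.someField' } } },
  { $project: { _id: 0, distinctCount: { $size: '$values' } } }
])
```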

Loop through Mongo Collection and update a field in every document

Submitted by 孤人 on 2019-12-22 08:03:10
Question: I have dates in one collection that were inserted incorrectly and are in a plain "2015-09-10" string format. I'd like to update them to the correct ISO Date format. I've tried looping through Mongo with forEach(), but I don't know the shell well enough to update each document in the collection. So far I'm at this point:

    db.getCollection('schedules').find({}).forEach(function (doc) {
        doc.time = new Date( doc.time ).toUTCString();
        printjson( doc.time );   // ^ This just prints "Invalid Date" …
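Picking up where the snippet stops: toUTCString() would produce another string rather than a Date, and the converted value still has to be written back to the collection. A minimal shell sketch under those assumptions, keeping the field name time from the snippet:

```javascript
// Convert string dates like "2015-09-10" into real Date objects and persist them.
db.getCollection('schedules').find({ time: { $type: 'string' } }).forEach(function (doc) {
  var converted = new Date(doc.time);     // stored as an ISODate in BSON
  if (!isNaN(converted.getTime())) {      // skip values that do not parse
    db.getCollection('schedules').updateOne(
      { _id: doc._id },
      { $set: { time: converted } }
    );
  }
});
```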

Project an element returned with “$arrayElemAt”

Submitted by 房东的猫 on 2019-12-22 06:47:04
Question: Forgive me if I confuse some terminology here, but I'm performing a join in an aggregation using the '$lookup' operator, as shown here:

    db.collection('items').aggregate([
        { $match: {} },
        { $lookup: { from: 'usr', localField: 'usr._id', foreignField: '_id', as: '__usr' } },
        { $project: { info: 1, timestamp: 1, usr: { "$arrayElemAt": [ "$__usr", 0 ] } } }
    ], (err, result) => {
        res.json(result);
        db.close();
    });

I'm performing a projection on the aggregation result, and I'm using ' …
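The question is cut off, but a common follow-up to this pattern is keeping only selected fields of the element extracted with $arrayElemAt instead of the whole joined document. A sketch under that assumption; the field names name and email are placeholders:

```javascript
db.collection('items').aggregate([
  { $match: {} },
  { $lookup: { from: 'usr', localField: 'usr._id', foreignField: '_id', as: '__usr' } },
  { $project: {
      info: 1,
      timestamp: 1,
      // "$__usr.name" resolves to the array of name values from the joined
      // documents, so $arrayElemAt picks the first one; dotted output keys
      // build the nested usr object in the projection.
      'usr.name':  { $arrayElemAt: ['$__usr.name', 0] },
      'usr.email': { $arrayElemAt: ['$__usr.email', 0] }
  } }
], (err, result) => {
  res.json(result);
  db.close();
});
```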

How to store unsigned long long (uint64_t) values in a MongoDB document?

Submitted by 冷暖自知 on 2019-12-22 05:23:28
Question: I want to store numbers of type unsigned long long (uint64_t) in a MongoDB document; how do I do it? I need unsigned long long because I'm using the Twitter API, which uses unsigned 64-bit integers: https://dev.twitter.com/docs/twitter-ids-json-and-snowflake The unsigned 64-bit integral type is represented by 8 bytes and has a range of 0 to 18,446,744,073,709,551,615. I'm using the C++ MongoDB driver and the append member function of the BSONArrayBuilder class …
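BSON itself has no unsigned 64-bit integer type, so whatever driver is used, the value must be mapped onto something BSON does have: the decimal string (Twitter's id_str exists for exactly this reason), a signed 64-bit integer carrying the same bit pattern, or raw binary. A minimal shell illustration of the first two options; the collection name and sample id are made up:

```javascript
// Option 1: store the id as a string – lossless and simple, at the cost of
// numeric sorting and range queries.
db.tweets.insertOne({ id_str: '1234567890123456789' })

// Option 2: store it as a signed 64-bit integer (NumberLong). Values above
// 2^63 - 1 still fit in the same 64 bits, but they read back as negative
// numbers and must be reinterpreted as unsigned by the application.
db.tweets.insertOne({ id: NumberLong('1234567890123456789') })
```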

MongoDB case insensitive index “starts with” performance problems

Submitted by 时光总嘲笑我的痴心妄想 on 2019-12-22 04:53:10
Question: After finding out that 3.3.11 supports case-insensitive indexes (using collation), I have rebuilt my database of 40 million records to play with this. The alternative was to add, e.g., lowercase fields dedicated to case-insensitive search and index those. What I did was ask MongoDB to apply a collation to my collection at creation time, as suggested here. So I did this to enable case insensitivity for the entire collection:

    db.createCollection("users", { collation: { locale: "en", strength: 1 } }) …
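The rest of the question is cut off, but the usual pain point with this setup is that "starts with" queries are written as $regex, and $regex matching is not collation-aware, so it cannot take advantage of an index built with a case-insensitive collation. A sketch of the distinction; the index field name is an assumption:

```javascript
// Collection-level collation: indexes created on the collection inherit
// strength 1 (case- and diacritic-insensitive) unless they override it.
db.createCollection('users', { collation: { locale: 'en', strength: 1 } })
db.users.createIndex({ username: 1 })

// Equality matches are case-insensitive and can use the index:
db.users.find({ username: 'alice' })

// A "starts with" $regex query does not honor the collation, and regexes
// can only use indexes with the simple collation, so this falls back to
// scanning instead of an index-prefix lookup:
db.users.find({ username: /^ali/ })
```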

MongoDB unwind multiple arrays

Submitted by 与世无争的帅哥 on 2019-12-22 04:18:18
Question: In MongoDB there are documents with the following structure:

    {
        "_id" : ObjectId("52d017d4b60fb046cdaf4851"),
        "dates" : [ 1399518702000, 1399126333000, 1399209192000, 1399027545000 ],
        "dress_number" : "4",
        "name" : "J. Evans",
        "numbers" : [ "5982", "5983", "5984", "5985" ]
    }

Is it possible to unwind data from multiple arrays and get only paired elements from the arrays:

    { "dates": "1399518702000", "numbers": "5982" },
    { "dates": "1399126333000", "numbers": "5983" },
    { "dates": "1399209192000", "numbers …
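The sample output is cut off, but pairing two parallel arrays element-by-element can be done entirely in the pipeline on MongoDB 3.4+, where $zip builds the pairs and a single $unwind flattens them. A minimal sketch; the collection name is a placeholder:

```javascript
db.getCollection('coll_name').aggregate([
  // $zip pairs the arrays positionally: [[date1, num1], [date2, num2], ...]
  { $project: { pairs: { $zip: { inputs: ['$dates', '$numbers'] } } } },
  { $unwind: '$pairs' },
  { $project: {
      _id: 0,
      dates:   { $arrayElemAt: ['$pairs', 0] },
      numbers: { $arrayElemAt: ['$pairs', 1] }
  } }
])
```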

MongoDB: Query a key having space in its name

Submitted by 眉间皱痕 on 2019-12-22 03:49:06
Question: I want to retrieve the values of only certain keys from a MongoDB collection, but the collection has some keys with a space in their name, like:

    "Parent": { "key1": //some string, "key2": //some string, "key 3": //some string }

I know this is a questionable approach, as there ideally shouldn't be spaces in a key name, but nevertheless: how do I query this key? I am using Python and PyMongo. For normal keys I can do this:

    db.coll_name.find({"key": "India"}, {"_id": 0, "Parent.key1": 1, "Parent.key2": 1 …
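Dot notation has no problem with a space in a field name as long as the whole path is quoted, so the projection can simply include "Parent.key 3". A minimal mongo shell sketch; the same filter and projection documents work unchanged as PyMongo arguments:

```javascript
// The quoted path includes the space verbatim; no escaping is needed.
db.coll_name.find(
  { key: 'India' },
  { _id: 0, 'Parent.key1': 1, 'Parent.key2': 1, 'Parent.key 3': 1 }
)
```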