mongoimport

Convert to date in MongoDB via mongoimport

北战南征 submitted on 2019-12-03 13:57:41
I have downloaded huge chunks of data in CSV format. I am using mongoimport to load the data into MongoDB for processing. How do I get the dates into a date format recognized by MongoDB? Sample data, with header:

Date, Open Price, High Price, Low Price, Last Traded Price, Close Price, Total Traded Quantity, Turnover (in Lakhs)
04-Apr-2014,901,912,889.5,896.75,892.85,207149,1867.08
03-Apr-2014,908,918,897.65,900,900.75,156260,1419.9
02-Apr-2014,916,921.85,898,900.7,900.75,175990,1591.97

Salvador Dali: As far as I know, there is no way to do this with mongoimport. But this is achievable by
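Since the answer is cut off, here is one common workaround it is likely describing: preprocess the CSV into JSON lines that use MongoDB's extended-JSON $date notation, then import with --type json. A minimal Python sketch; the assumption that the date column is named Date comes from the sample header above, and the file names are placeholders:

```python
import csv
import json
from datetime import datetime

def row_to_doc(row):
    """Rewrite the Date column ('04-Apr-2014') as extended JSON so
    mongoimport stores a real BSON Date instead of a string."""
    dt = datetime.strptime(row["Date"], "%d-%b-%Y")
    doc = dict(row)
    doc["Date"] = {"$date": dt.strftime("%Y-%m-%dT%H:%M:%SZ")}
    return doc

def convert(csv_path, json_path):
    # One compact JSON document per line, the shape mongoimport expects
    with open(csv_path, newline="") as src, open(json_path, "w") as dst:
        for row in csv.DictReader(src):
            dst.write(json.dumps(row_to_doc(row)) + "\n")
```

Then run mongoimport --db mydb --collection prices --file out.json (db, collection, and file names are placeholders, and note there is no --type csv any more); the Date field arrives as a queryable BSON Date.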

MongoDB Bulk import using mongoimport from Windows folder

旧巷老猫 submitted on 2019-12-02 21:07:28
I have a lot of JSON files in an archive and I need to import them into Mongo in one operation (I think it could be done in a loop). Do you have any ideas about this?

Sumeet: If you are in a Linux/Unix shell you can try:

for filename in *.json; do mongoimport -d mydb -c "${filename%.json}" --file "$filename"; done

If you are on Windows:

FOR %i IN (C:\mongodbData\*.json) DO mongoimport --db dbName --collection collection --type json --file %i

Note that for recovering from a dump created by mongodump you need mongorestore instead (http://docs.mongodb.org/v2.6/reference/program/mongorestore/), for example: mongorestore --drop --oplogReplay mongodb/
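The per-file loop above can also be driven from a script, which makes the "collection name from file name" convention explicit. A hypothetical Python sketch (build_import_command and the db name are invented; the real work is still done by the mongoimport binary, which must be on PATH):

```python
import pathlib
import subprocess

def build_import_command(path, db="mydb"):
    """One mongoimport invocation per file; the collection name is
    derived from the file name without its .json extension."""
    return ["mongoimport", "--db", db,
            "--collection", path.stem,
            "--type", "json", "--file", str(path)]

def import_folder(folder, db="mydb"):
    # Sorted for deterministic order; check=True aborts on the first failure
    for path in sorted(pathlib.Path(folder).glob("*.json")):
        subprocess.run(build_import_command(path, db), check=True)
```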

Which MongoDB types are not preserved by mongoimport/mongoexport?

若如初见. submitted on 2019-12-01 11:52:16
Question: The documentation for mongoexport has this scary warning:

Avoid using mongoimport and mongoexport for full instance production backups. They do not reliably preserve all rich BSON data types, because JSON can only represent a subset of the types supported by BSON. Use mongodump and mongorestore as described in MongoDB Backup Methods for this kind of functionality.

The page then goes on to say:

To preserve type information, mongoexport and mongoimport uses the strict mode representation for
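To see concretely what the warning means: the strict/extended-mode representation keeps BSON type information only as a naming convention inside ordinary JSON, which any non-MongoDB consumer will read as plain dicts and strings. A stdlib-only Python illustration (no bson driver; the sample values are invented):

```python
import json

# mongoexport renders a BSON Date and a 64-bit integer as annotated
# plain JSON (extended JSON), for example:
exported = ('{"when": {"$date": "2014-04-04T00:00:00Z"},'
            ' "n": {"$numberLong": "9007199254740993"}}')

doc = json.loads(exported)

# A generic JSON parser sees only dicts and strings; the BSON types
# survive purely as the $date / $numberLong naming convention that
# mongoimport knows how to reverse.
assert isinstance(doc["when"], dict)

# 9007199254740993 is 2**53 + 1: as a bare JSON number it could not
# round-trip through a double, which is why it is shipped as a string.
assert doc["n"]["$numberLong"] == "9007199254740993"
```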

Using mongoimport to read CSV into nested structure?

一个人想着一个人 submitted on 2019-12-01 04:38:11
I have a Mongo document with a structure like:

{ "foo": { "bar1": "val1", "bar2": "val2" } }

I'd like to import my data from a CSV using mongoimport --type csv --headerline [...] but I am not sure how to format the field name in the CSV to address the nested structure. For instance, test.csv:

foo.bar1
example

returns

{ "_id" : ObjectId("4e9d9d25c5d8708e1f51cdbc"), "foo.bar1" : "example" }

instead of the desired output:

{ "_id" : ObjectId("4e9d9d25c5d8708e1f51cdbc"), "foo" : { "bar1" : "example" } }

The field name seems to be interpreted as a string regardless of its value. Things like foo[bar1] and foo
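Where mongoimport treats a dotted CSV header as a literal field name, one workaround is to pre-convert the CSV to JSON, expanding the dots yourself. A minimal sketch in Python (expand_dotted is a hypothetical helper name; the foo.bar1 header comes from the question):

```python
import csv
import json

def expand_dotted(row):
    """Turn {'foo.bar1': 'example'} into {'foo': {'bar1': 'example'}}."""
    doc = {}
    for key, value in row.items():
        parts = key.split(".")
        node = doc
        for part in parts[:-1]:
            # Walk/create the nested dicts for every dotted segment
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return doc

def csv_to_json_lines(csv_path, json_path):
    with open(csv_path, newline="") as src, open(json_path, "w") as dst:
        for row in csv.DictReader(src):
            dst.write(json.dumps(expand_dotted(row)) + "\n")
```

Import the result with --type json instead of --type csv. Newer mongoimport releases reportedly expand dotted headers into nested documents on their own, so check your version before reaching for a preprocessing step.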

Mongoimport csv files with string _id and upsert

a 夏天 submitted on 2019-12-01 04:05:08
Question: I'm trying to use mongoimport to upsert data with string values in _id. Since the ids look like integers (even though they're in quotes), mongoimport treats them as integers and creates new records instead of upserting the existing records. Command I'm running:

mongoimport --host localhost --db database --collection my_collection --type csv --file mydata.csv --headerline --upsert

Example data in mydata.csv:

{ "_id" : "0364", someField: "value" }

The result would be for mongo to insert a
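If your mongoimport is new enough (3.4+), --columnsHaveTypes with a field spec such as _id.string() addresses this directly; otherwise a version-independent workaround is converting the CSV to JSON yourself, so the quoting survives mongoimport's type guessing. A Python sketch (csv_to_string_id_json is a hypothetical helper name):

```python
import csv
import io
import json

def csv_to_string_id_json(csv_text):
    """Convert CSV text to JSON lines. csv.DictReader yields every
    value as a str, so json.dumps keeps _id quoted ("0364" never
    becomes 364) and the upsert matches the existing string _id."""
    out = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        out.append(json.dumps(row))
    return "\n".join(out)
```

Then import the JSON file with mongoimport --type json --upsert instead of the CSV.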

MongoDb: How to import dump data from .gz file?

故事扮演 submitted on 2019-12-01 04:02:19
I want to import dump data from my .gz file. The location of the file is home/Alex/Documents/Abc/dump.gz and the name of the db is "Alex". I have tried:

mongorestore --gzip --db "Alex" /home/Alex/Documents/Abc/dump.gz

But it shows an error:

2018-10-31T12:54:58.359+0530 the --db and --collection args should only be used when restoring from a BSON file. Other uses are deprecated and will not exist in the future; use --nsInclude instead
2018-10-31T12:54:58.359+0530 Failed: file /home/Alex/Documents/Abc/dump.gz does not have .bson extension.

How can I import it? Dump command: mongodump --host localhost:27017 -

Importing Date-datatype using mongoimport

≯℡__Kan透↙ submitted on 2019-11-28 07:00:59
I have many GB of data stored in a PostgreSQL database and I need them imported into MongoDB. I did this using CSV export and mongoimport. There are columns like '2011-06-25' in that CSV, and they have been imported as strings, not as MongoDate, so I cannot effectively search by date. I've found this: http://www.mongodb.org/display/DOCS/Import+Export+Tools#ImportExportTools-Example%3AImportingInterestingTypes but the example says I need to use a JSON structure for the file. Do I really need to export a JSON file from PostgreSQL? If I do, how? If I don't, how to export "MongoDate"
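You do not have to make PostgreSQL emit JSON; you can keep the CSV export and rewrite the date column into extended-JSON $date objects before importing. A short sketch, assuming the column is named date_col and formatted YYYY-MM-DD as in the question (both names are assumptions):

```python
import csv
import json

def csv_to_mongo_json(csv_text, date_field="date_col"):
    """Rewrite an ISO date column ('2011-06-25') as an extended-JSON
    $date so mongoimport stores a real BSON Date, not a string."""
    lines = []
    for row in csv.DictReader(csv_text.splitlines()):
        row[date_field] = {"$date": row[date_field] + "T00:00:00Z"}
        lines.append(json.dumps(row))
    return "\n".join(lines)
```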

Importing JSON file using mongoimport, keep getting `unexpected identifier`?

寵の児 submitted on 2019-11-27 11:34:49
Question: I'm trying to add a JSON file to MongoDB using mongoimport from the terminal:

mongoimport --db my_db --collection my_collection --file /content/2_read.json

I keep getting:

JavaScript execution failed: SyntaxError: Unexpected identifier

I ran my JSON through JSON Lint (http://jsonlint.com/), which says it's valid JSON. I'm not sure what could be tripping up the import process, or how to investigate further to hunt down the issue.

UPDATE: Somebody suggested putting it all on one line. A decent
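Two likely culprits here: "JavaScript execution failed" usually means the command was typed inside the mongo shell rather than the operating-system terminal (mongoimport is an OS-level binary), and older mongoimport versions also choke on a top-level JSON array or pretty-printed file unless you pass --jsonArray. For the second case, the one-line suggestion from the update can be sketched in Python (to_json_lines is a hypothetical helper name):

```python
import json

def to_json_lines(text):
    """Accept either a top-level JSON array or a single document and
    return one compact document per line, the shape mongoimport
    historically expected without --jsonArray."""
    data = json.loads(text)
    docs = data if isinstance(data, list) else [data]
    # separators=(",", ":") strips the pretty-printing whitespace
    return "\n".join(json.dumps(doc, separators=(",", ":")) for doc in docs)
```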