azure-cosmosdb

CosmosDb Request rate is large with insertMany

醉酒当歌 Posted on 2020-01-06 06:54:44

Question: I have the following repository class inserting data into a Cosmos DB database from a batch:

```csharp
public bool InsertZonierData(List<Zonier> zonierList)
{
    if (zonierList == null || !zonierList.Any())
    {
        throw new ZonierListNullOrEmptyException();
    }
    else
    {
        try
        {
            _collection.InsertMany(zonierList);
            return true;
        }
        catch (MongoBulkWriteException ex)
        {
            throw new DataBaseWritingException(ex.Message, ExceptionCodeConstants.DataBaseWritingExceptionCode);
        }
    }
}
```

Unfortunately, having more than 30000 elements in …
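The "Request rate is large" error means the bulk write momentarily exceeds the collection's provisioned request units, so the service throttles the remaining writes. A common workaround is to split the batch into smaller chunks and retry throttled chunks with a back-off delay. The question's code is C#; the sketch below shows the same idea in JavaScript against a stand-in `insertMany`, and the chunk size, retry count, and delays are illustrative assumptions, not tuned values.

```javascript
// Sketch: split a large batch into smaller chunks and retry throttled
// writes with exponential back-off. `insertMany` stands in for the driver
// call; through the Mongo API, Cosmos DB throttling surfaces as error
// code 16500 ("Request rate is large").
function chunk(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

async function insertInChunks(insertMany, items, { size = 100, maxRetries = 5 } = {}) {
  for (const batch of chunk(items, size)) {
    let attempt = 0;
    for (;;) {
      try {
        await insertMany(batch);
        break; // this chunk is written, move to the next one
      } catch (err) {
        if (err.code !== 16500 || ++attempt > maxRetries) throw err;
        // Back off before retrying the throttled chunk.
        await new Promise(resolve => setTimeout(resolve, 100 * 2 ** attempt));
      }
    }
  }
}
```

An alternative is simply to raise the provisioned RU/s for the duration of the import, but chunking with back-off works without changing the collection's throughput.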

Importing Json using DocumentDB Data Migration tool gives “Error while fetching page of documents code: 400” in CosmosDB

北战南征 Posted on 2020-01-06 06:43:09

Question: I am trying to import a JSON file created using mongoexport into Cosmos DB. I am using the DocumentDB migration tool, which runs successfully and imports the collection into Cosmos. When I try looking at the collection through Cosmos DB, I get the error:

Error while fetching page of documents: {"code":400,"body":"Command find failed: Unknown server error occurred when processing this request.."}

I am expecting the document to be viewable through Cosmos DB but am not sure why I am …

Node.js - Async - multiple inner callbacks within an if statement

孤街浪徒 Posted on 2020-01-06 04:34:12

Question: I'm using Node.js and the async library, but I keep seeing the error: Callback was already called. I think I understand why I get the error, but I don't know whether what I want is actually possible, or how to resolve it. Basically I want both inner callbacks to have completed before the outer callback completes. The code with which I am facing this issue looks like:

```javascript
async.forEachLimit(inData, 25, function (data, innercallback) {
    myJson.matches.forEach(function (oMatches) {
        // …
```
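The async library expects each worker's callback to fire exactly once per item; invoking `innercallback` inside the inner `forEach` fires it once per match, which is what produces "Callback was already called". One fix is to run the synchronous inner loop first and signal completion a single time afterwards. The sketch below reproduces the guard async uses internally and shows the corrected worker shape; `makeWorker` and `onResult` are illustrative names, not part of the async API.

```javascript
// A `once` guard mirrors what async enforces internally: a second call to
// the same callback is an error.
function once(fn) {
  let called = false;
  return function (...args) {
    if (called) throw new Error('Callback was already called');
    called = true;
    return fn(...args);
  };
}

// Corrected shape for the worker passed to async.forEachLimit: iterate the
// matches synchronously, collect results, then signal completion exactly
// once after the inner loop has finished.
function makeWorker(myJson, onResult) {
  return function (data, innercallback) {
    const cb = once(innercallback);
    myJson.matches.forEach(function (oMatch) {
      onResult(data, oMatch); // collect results; do NOT call cb here
    });
    cb(null); // single completion signal per item
  };
}
```

If the inner work is itself asynchronous, the same principle applies: aggregate the inner completions (e.g. with a counter or `Promise.all`) and call the outer callback only when all of them are done.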

Save File in DocumentDb with custom name

淺唱寂寞╮ Posted on 2020-01-05 11:04:29

Question: I'm trying to save an XML file in DocumentDb in JSON format. I have no problem converting and saving it; the conversion and saving both work fine. However, when I store my XML file, DocumentDB assigns its own name to the document, e.g. 8756b9b9-41ac-47ca-9f4c-abec60b318be. I want to save the file under my own custom name instead, e.g. MyXmlFile1 or MyXmlFile2. How do I pass my custom name when saving the file? "jsontoStore" has the content of the …
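DocumentDB names a document by its `id` property; when the JSON being stored has no `id`, the service generates a GUID, which is the behavior described above. Setting `id` on the JSON before the create call makes the document addressable by that name. A minimal sketch, assuming `jsontoStore` is the converted XML as a plain object (the function name is illustrative):

```javascript
// Sketch: give the document an explicit "id" before saving, so DocumentDB
// uses it instead of generating a GUID. Returns a copy so the original
// object is left untouched.
function withCustomId(jsontoStore, customName) {
  return Object.assign({}, jsontoStore, { id: customName });
}

// Usage: pass the result to the create call, e.g.
//   client.createDocument(collectionLink, withCustomId(jsontoStore, 'MyXmlFile1'), ...)
```

Note that `id` must be unique within the collection (per partition key value), so saving MyXmlFile1 twice will fail with a conflict rather than overwrite.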

Connect an HTML Drop Down List to Cosmos DB

帅比萌擦擦* Posted on 2020-01-05 08:26:17

Question: I am looking to connect an HTML form to a Cosmos DB document rather than using static data. Can anyone guide me in the right direction? This is what I have for static data; I have a local DB set up for testing. Thanks for any assistance.

```javascript
var select = document.getElementById("ClosurePlanList"),
    arr = ["test1", "test2", "test3"];
for (var i = 0; i < arr.length; i++) {
    var option = document.createElement("OPTION"),
        txt = document.createTextNode(arr[i]);
    option.appendChild(txt);
    option.setAttribute(…
```
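Browser JavaScript should not query Cosmos DB directly, since that would expose the account keys; the usual shape is a small backend route that runs the query and returns JSON, which the page then fetches and turns into `<option>` elements. The sketch below separates the data mapping (testable) from the DOM work; the `/api/closure-plans` route and the `name` field are assumptions about such a backend, not an existing API.

```javascript
// Sketch: map documents returned by a backend query into option data.
// Assumes each document has an `id` and a display `name` field.
function toOptions(items) {
  return items.map(function (item) {
    return { value: item.id, text: item.name };
  });
}

// In the page, something along these lines would replace the static array:
//   fetch('/api/closure-plans')
//     .then(function (res) { return res.json(); })
//     .then(function (items) {
//       var select = document.getElementById('ClosurePlanList');
//       toOptions(items).forEach(function (opt) {
//         var option = document.createElement('option');
//         option.value = opt.value;
//         option.appendChild(document.createTextNode(opt.text));
//         select.appendChild(option);
//       });
//     });
```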

Gremlin on Azure CosmosDB: how to project the related vertices' properties?

雨燕双飞 Posted on 2020-01-05 07:30:52

Question: I use the Microsoft.Azure.Graphs library to connect to a Cosmos DB instance and query the graph database. I'm trying to optimize my Gremlin queries so that I only select the properties I require. However, I don't know how to choose which properties to select from edges and vertices. Let's say we start from this query:

```gremlin
g.V().hasLabel('user').
  project('user', 'edges', 'relatedVertices').
    by().
    by(bothE().fold()).
    by(both().fold())
```

This will return something along the lines of …
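In standard Gremlin, each `by()` modulator of `project()` accepts a sub-traversal, so the usual way to restrict what comes back is to end each branch with `valueMap(...)` (or `values(...)`) naming only the wanted properties instead of returning whole elements. A sketch of that shape, where the property names `name` and `weight` are placeholders for whatever the actual graph schema uses:

```gremlin
g.V().hasLabel('user').
  project('user', 'edges', 'relatedVertices').
    by(valueMap('name')).
    by(bothE().valueMap('weight').fold()).
    by(both().valueMap('name').fold())
```

This trims the payload (and therefore the RUs) per result, since only the listed properties cross the wire rather than full vertex and edge structures.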

Determining size of a JSON document stored in DocumentDB

∥☆過路亽.° Posted on 2020-01-04 18:15:32

Question: I'm developing a partitioning strategy for a multi-tenant application running on DocumentDB. Since each collection only allows 10 GB of storage, I am attempting to calculate how many documents each of my tenants can store, so I can work out how many tenants I can place in a collection. I have a sample JSON document that represents a common document that a tenant may store. Using Document Explorer in the Azure Portal does not tell me what the size of one of these documents is on …
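Since the stored size of a document tracks the size of its JSON serialization, a working estimate is simply the UTF-8 byte length of the serialized sample; the service adds some per-document overhead for system properties (`_rid`, `_ts`, `_self`, `_etag`), so the estimate is a lower bound. A sketch of the capacity arithmetic, where the helper names and the docs-per-tenant figure are illustrative:

```javascript
// Sketch: estimate a document's stored size as the UTF-8 byte length of
// its JSON serialization (system-property overhead not included).
function estimateDocumentBytes(doc) {
  return Buffer.byteLength(JSON.stringify(doc), 'utf8');
}

// With a 10 GB collection, how many tenants fit if each tenant stores
// `docsPerTenant` documents of roughly `docBytes` each?
function tenantsPerCollection(collectionBytes, docBytes, docsPerTenant) {
  return Math.floor(collectionBytes / (docBytes * docsPerTenant));
}
```

For example, with 1 KiB documents and 1000 documents per tenant, a 10 GiB collection holds on the order of ten thousand tenants before overhead.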

Migrating a Cosmos DB fixed collection with partition key to an unlimited collection

余生颓废 Posted on 2020-01-04 04:27:04

Question: I have a Cosmos DB Fixed Collection. The collection was created with, and uses, a partition key. What are the migration options from this Fixed Collection to an Unlimited Collection? I know that I can use the Azure Cosmos DB Migration Tool to export the data to JSON and then import it into a newly provisioned Unlimited Collection. Are there any other options supported by Microsoft?

Answer 1: The Azure team wrote a migration tool that uses Cosmos Change Feeds and the Change Feed Processor: https://github.com…

DocumentDb write within a transactionscope

╄→尐↘猪︶ㄣ Posted on 2020-01-04 03:32:05

Question: I am trying to use a DocumentDb write as part of a transaction, like below:

```csharp
using (var scope = new TransactionScope())
{
    // first transaction
    // write to DocumentDb
    // third transaction
}
```

I observed that if the third transaction fails, the DocumentDb write is not rolled back and I still see the document in the collection. The first transaction (NEventStore in this case) rolls back perfectly. Does anyone know whether DocumentDb supports TransactionScope? What about a nested transaction? Thanks! Edit: …
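DocumentDB does not enlist in an ambient `TransactionScope`; its server-side transactions exist only inside a stored procedure or trigger, scoped to a single partition. A common application-level substitute is a compensating action: if a later step in the logical transaction fails, delete the document that was written. The sketch below (in JavaScript, with `writeDoc`/`deleteDoc` standing in for the real client calls) shows that shape; note the compensation is best-effort, not a true rollback.

```javascript
// Sketch of a compensation pattern: the document write cannot join an
// ambient transaction, so undo it manually when a later step fails.
async function writeWithCompensation(writeDoc, deleteDoc, doc, laterStep) {
  const written = await writeDoc(doc);
  try {
    await laterStep();
    return written;
  } catch (err) {
    // Later step failed: compensate by removing the document we wrote,
    // then surface the original error to the caller.
    await deleteDoc(written);
    throw err;
  }
}
```

If all the documents involved share a partition key, putting the multi-document logic in a stored procedure instead gives genuine all-or-nothing semantics on the DocumentDB side.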