azure-cosmosdb-sqlapi

CosmosDB C# SDK ProxyUrl missing

Submitted by 拈花ヽ惹草 on 2019-12-11 19:37:09
Question: I am working on a containerized microservices project (Docker) in .NET Core targeting Azure. We are using Azure Cosmos DB and the C# Cosmos DB SDK (v2.4, since v3 is only in preview) for the Core SQL API. During development I am behind a proxy, so I need to specify the proxy URL for everything going outside. That works fine with HttpClientHandler for any HttpClient. With the Cosmos DB SDK, however, I don't see how to set this up on my DocumentClient. In the Azure Node SDK, I see I could just
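For contrast, here is the Node side the question alludes to: the legacy documentdb Node package exposes a ProxyUrl field on its connection policy (the setting the title refers to). A minimal sketch, shown as a plain options object so it runs without the package installed; the DocumentClient call in the comment is the assumed real usage.

```javascript
// Sketch, assuming the legacy "documentdb" Node package, whose ConnectionPolicy
// carries a ProxyUrl field. Built as a plain object here so the shape is
// visible without the package or a live account.
function connectionPolicyWithProxy(proxyUrl) {
  return {
    ProxyUrl: proxyUrl, // e.g. "http://corp-proxy:8080"
  };
}

// In real code (assumed usage):
// const client = new DocumentClient(endpoint, { masterKey },
//   connectionPolicyWithProxy("http://corp-proxy:8080"));
console.log(connectionPolicyWithProxy("http://corp-proxy:8080").ProxyUrl); // "http://corp-proxy:8080"
```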

Azure CosmosDB Continuation Token Structure

Submitted by 匆匆过客 on 2019-12-11 13:05:24
Question: I read a lot of the documentation on Cosmos DB paging and thought the token should look something like this: {"token":"xxxxxx","range":{"min":"xxxxxxxxxx","max":"xxxxxxxxxx"}} But I got a token that looks like this: [{"compositeToken":{"token":"xxxxxxxxx","range":{"min":"","max":"05C1B9CD673390"}},"orderByItems":[{"item":24}],"rid":"duJVAIns+3N6AAAAAAAAAA==","skipCount":0,"filter":null}] I was wondering in what scenario the token would have
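The two shapes above can coexist: cross-partition and ORDER BY queries return an array of per-range composite tokens, while the simple form is a single {token, range} object. A sketch of a normalizing helper (hypothetical, not part of any SDK) that lets calling code treat both uniformly:

```javascript
// Normalize both continuation-token shapes into an array of {token, range}.
// The array-of-compositeToken form appears for cross-partition / ORDER BY
// queries; the single-object form appears for simple single-range queries.
function normalizeContinuation(raw) {
  const parsed = JSON.parse(raw);
  if (Array.isArray(parsed)) {
    // ORDER BY / cross-partition shape: unwrap each compositeToken entry.
    return parsed.map(entry => ({
      token: entry.compositeToken.token,
      range: entry.compositeToken.range,
    }));
  }
  // Simple shape: already a {token, range} object.
  return [parsed];
}

const simple = '{"token":"abc","range":{"min":"00","max":"FF"}}';
const composite =
  '[{"compositeToken":{"token":"xyz","range":{"min":"","max":"05C1B9CD673390"}},' +
  '"orderByItems":[{"item":24}],"rid":"duJVAIns+3N6AAAAAAAAAA==","skipCount":0,"filter":null}]';

console.log(normalizeContinuation(simple)[0].token);        // "abc"
console.log(normalizeContinuation(composite)[0].range.max); // "05C1B9CD673390"
```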

How do I get UNIQUE categories from all documents in CosmosDB?

Submitted by 做~自己de王妃 on 2019-12-11 11:26:57
Question: I have millions of documents in Cosmos DB using the SQL API, and I need to find the unique categories across all documents. The documents look like the following; you can see the categories array just under the description. I don't care what order they are in, I just need to know all the unique ones across all documents in the collection. I need this so that later on I can create queries on the categories, but that's a later question; first I need to get them all out so I know what all the possible options are,
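A sketch of two approaches: server-side, standard Cosmos SQL can unroll the array with JOIN and deduplicate with DISTINCT VALUE; client-side, a fold over already-fetched pages does the same. Shown with plain objects so it runs without an account; the field name `categories` follows the question's documents.

```javascript
// Server-side option: DISTINCT VALUE over a JOIN that unrolls each
// document's categories array (standard Cosmos SQL).
const querySpec = {
  query: "SELECT DISTINCT VALUE cat FROM c JOIN cat IN c.categories",
};

// Client-side equivalent over pages you have already fetched:
function uniqueCategories(docs) {
  const seen = new Set();
  for (const doc of docs) {
    for (const cat of doc.categories || []) seen.add(cat);
  }
  return [...seen].sort();
}

const docs = [
  { id: "1", categories: ["books", "fiction"] },
  { id: "2", categories: ["fiction", "history"] },
  { id: "3" }, // document with no categories at all
];
console.log(uniqueCategories(docs)); // [ 'books', 'fiction', 'history' ]
```

At millions of documents, the server-side DISTINCT is the one to prefer; the client-side fold is only a fallback when the query must stay simple.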

Adding a CosmosDB entity, (Unable to resolve iD for entity of type Tenant)

Submitted by *爱你&永不变心* on 2019-12-11 06:51:07
Question: I am trying to add a Cosmos DB document using the following package: https://github.com/Elfocrash/Cosmonaut The API controller is this: [HttpPut] public async Task<IHttpActionResult> PutTenant([ModelBinder(typeof(TenantModelBinder))] Tenant tenant) { //var provider = new MultipartMemoryStreamProvider(); //var contentType = ""; //var content = new byte[0]; //await base.Request.Content.ReadAsMultipartAsync(provider); //if (provider.Contents.Count > 0) //{ // contentType = provider.Contents[0]

Cosmos DB out of Memory exception while executing stored procedure

Submitted by 六月ゝ 毕业季﹏ on 2019-12-10 12:06:36
Question: I am using the Azure Cosmos DB SQL API. I have written a stored procedure that fetches data and accumulates it in the response feed. It fails with: Failed to execute stored procedure testProcedure for collection iotcollection: {"code":400,"body":"{"code":"BadRequest","message":"Message: {"Errors":["Encountered exception while executing function. Exception = Error: Out of memory\r\nStack trace: undefined"]}\r\nActivityId: c286cbb6-34c1-4929-a148-915544b20ce6, Request URI: /apps/59d3b9ef-17ca-4bbf-8a11
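A common cause of this error is accumulating every matching document into one response array inside the sandboxed stored procedure. The usual remedy is bounded pages: process a capped batch, return a continuation value, and let the caller re-invoke until done. The shape of that pattern, sketched as plain JavaScript (without getContext(), so it is runnable here):

```javascript
// Bounded-batch sketch: one "invocation" handles at most pageSize items and
// hands back a continuation instead of holding everything in memory.
// In a real stored procedure, allDocs would come from queryDocuments() and
// the continuation would ride on the response body.
function processPage(allDocs, continuation, pageSize) {
  const start = continuation || 0;
  const page = allDocs.slice(start, start + pageSize);
  const next = start + page.length;
  return {
    items: page,
    continuation: next < allDocs.length ? next : null, // null => finished
  };
}

// Caller loop: keep re-invoking until the continuation comes back null.
const data = Array.from({ length: 10 }, (_, i) => ({ id: i }));
let token = null;
let pages = 0;
do {
  const res = processPage(data, token, 4);
  token = res.continuation;
  pages += 1;
} while (token !== null);
console.log(pages); // 3 (batches of 4 + 4 + 2)
```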

NULL Continuation Token in CosmosDB

Submitted by 时光总嘲笑我的痴心妄想 on 2019-12-10 11:55:11
Question: I know that if the continuation token is null (the whole token JSON is null), there will be no more data in the next request. In my case, I checked that there is definitely no more data on the next page, and the token is supposed to be null. But it returns this "half"-null token, which looks something like this: {"token":null,"range":{"min":"xxxxxxxxxx","max":"xxxxxxxxxx"}} Only the inner token is null, while min and max are not. What does this indicate? In
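A defensive helper (hypothetical, not an SDK API) can treat both shapes as "no more results": a continuation that is null/absent outright, and the half-null shape above where only the inner token is null while a range is still attached:

```javascript
// Treat a continuation as exhausted when the whole value is missing OR when
// its inner "token" is null even though a range object is still present --
// the "half"-null shape described in the question.
function hasMoreResults(rawContinuation) {
  if (rawContinuation == null) return false; // whole token missing => done
  const parsed = JSON.parse(rawContinuation);
  return parsed.token != null;               // inner token null => done
}

console.log(hasMoreResults(null));                                             // false
console.log(hasMoreResults('{"token":null,"range":{"min":"00","max":"FF"}}')); // false
console.log(hasMoreResults('{"token":"abc","range":{"min":"00","max":"FF"}}')); // true
```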

Cosmos DB Query Works in Data Explorer But Not Node.js

Submitted by 为君一笑 on 2019-12-08 12:45:53
Question: I am trying to run the following query against my Cosmos DB using Node.js. const querySpec = { query: "SELECT * FROM Users u WHERE u.id = @email", parameters: [ { name: "@email", value: "testuser@gmail.com" } ] }; const { result: results } = client.database(databaseId).container(containerId).items.query(querySpec).toArray(); if (results.length == 0) { throw "No matching user"; } else if (results.length > 1) { throw "Account found"; } const user = results[0]; console.log(user); however I keep
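The query call above is asynchronous: in @azure/cosmos it must be awaited, and newer SDK versions expose the rows as `resources` from `fetchAll()` rather than `result` from `toArray()`. A sketch of the corrected pattern, using a stub container so it runs without an account; in real code `container` would come from `client.database(databaseId).container(containerId)`:

```javascript
// Corrected pattern: await the query and destructure `resources`.
async function findUser(container, email) {
  const querySpec = {
    query: "SELECT * FROM Users u WHERE u.id = @email",
    parameters: [{ name: "@email", value: email }],
  };
  const { resources: results } = await container.items
    .query(querySpec)
    .fetchAll();
  if (results.length === 0) throw new Error("No matching user");
  if (results.length > 1) throw new Error("More than one matching account");
  return results[0];
}

// Stub standing in for a real Cosmos container (so the sketch is runnable):
const stubContainer = {
  items: {
    query: () => ({
      fetchAll: async () => ({
        resources: [{ id: "testuser@gmail.com", name: "Test User" }],
      }),
    }),
  },
};

findUser(stubContainer, "testuser@gmail.com").then(u => console.log(u.id)); // logs "testuser@gmail.com"
```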

Why am I seeing different index behaviour between 2 seemingly identical CosmosDb Collections

Submitted by 落花浮王杯 on 2019-12-04 07:48:39
I'm trying to debug a very strange discrepancy between two separate Cosmos DB collections that on the face of it are configured identically. We recently modified some code that executed the following query. OLD QUERY: SELECT * FROM c WHERE c.ProductId = "CODE" AND c.PartitionKey = "Manufacturer-GUID" NEW QUERY: SELECT * FROM c WHERE (c.ProductId = "CODE" OR ARRAY_CONTAINS(c.ProductIdentifiers, "CODE")) AND c.PartitionKey = "Manufacturer-GUID" The introduction of that ARRAY_CONTAINS call in the production environment has tanked the performance of this query from ~3 RU/s to ~6000 RU/s. But only in the
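One plausible cause worth checking for a discrepancy like this is a differing indexing policy between the two collections: ARRAY_CONTAINS only stays cheap when the array's element path is actually covered by the index. A hedged sketch of a policy that explicitly includes the paths from the query above (property names taken from the question; the exact policy on the real collections is unknown):

```json
{
  "indexingMode": "consistent",
  "automatic": true,
  "includedPaths": [
    { "path": "/PartitionKey/?" },
    { "path": "/ProductId/?" },
    { "path": "/ProductIdentifiers/[]/?" },
    { "path": "/*" }
  ],
  "excludedPaths": []
}
```

Comparing the actual policy JSON of both collections in the portal (especially excludedPaths and any legacy hash-index ranges) is the quickest way to confirm or rule this out.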

How to do distributed transaction coordination around SQL API and GraphDB in CosmosDB?

Submitted by 拥有回忆 on 2019-12-04 06:06:09
Question: I have a Customer container whose items represent a single customer in the SQL API (DocumentDB) in Cosmos DB. I also have a Gremlin API (GraphDB) graph with the customers' shopping-cart data. Both of these data sets are temporary/transient. The customer can choose to clear the shopping cart, which deletes both the temporary customer and the shopping-cart data. Currently I make separate calls, one to the SQL API (DocumentDB) and one to the Gremlin API (GraphDB), which works, but I want to do both as a transaction (ACID principle). To delete a customer, I call the Gremlin API and delete the shopping-cart data, then call the SQL API
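Cosmos DB offers no distributed transaction spanning the SQL and Gremlin APIs, so one common workaround is a compensating sequence (a tiny saga): delete the cart first, and if the follow-up customer delete fails, restore the cart so neither side is left half-deleted. A sketch under that assumption; `deleteCart`, `deleteCustomer`, and `restoreCart` are hypothetical stand-ins for the Gremlin and SQL API calls, backed here by an in-memory store so the compensation path is runnable:

```javascript
// Compensating-sequence sketch: delete cart (Gremlin), then customer (SQL);
// on failure, restore the cart from the backup returned by the first step.
async function clearCustomer(ops, customerId) {
  const cartBackup = await ops.deleteCart(customerId);  // Gremlin API call
  try {
    await ops.deleteCustomer(customerId);               // SQL API call
    return { ok: true };
  } catch (err) {
    await ops.restoreCart(customerId, cartBackup);      // compensate
    return { ok: false, reason: String(err) };
  }
}

// In-memory stand-in so both the happy path and the compensation path run:
function makeStore(failCustomerDelete) {
  const state = { cart: ["item-1"], customer: { id: "c1" } };
  return {
    state,
    deleteCart: async () => { const c = state.cart; state.cart = null; return c; },
    deleteCustomer: async () => {
      if (failCustomerDelete) throw new Error("SQL API unavailable");
      state.customer = null;
    },
    restoreCart: async (_id, backup) => { state.cart = backup; },
  };
}
```

This gives eventual atomicity, not true ACID: a crash between the two calls still needs a cleanup job, which is acceptable here since both data sets are transient anyway.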
