azure-blob-storage

MarkLogic Cluster - Configure Forest with all documents

Posted by 喜你入骨 on 2019-12-11 20:08:45
Question: We are working with MarkLogic 9.0.8.2, setting up a MarkLogic cluster (3 VMs) on Azure. As per the failover design, we want to have 3 forests (one per node) in Azure Blob storage. The setup is done, but when I started ingestion I found that documents are distributed across the 3 forests rather than all stored in each forest. For example, I ingested 30,000 records and each forest contains 10,000 of them. What I need is for every forest to hold all 30,000 records. Is there any configuration (at the database or forest level) I need…

What is the best way to get Azure blob storage

Posted by 前提是你 on 2019-12-11 17:59:02
Question: I'm working with Scala and Spark and need to access Azure Blob Storage and get its list of files. What is the best way to do that, given the version is 2.11 (2.11 is the Scala build version; Spark itself would be 2.x)?

Answer 1: For Spark running locally, there is an official blog post which introduces how to access Azure Blob Storage from Spark. The key is that you need to configure the Azure Storage account as HDFS-compatible storage in the core-site.xml file and add two jars, hadoop-azure and azure-storage, to your classpath for accessing HDFS via the wasb[s]… protocol
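A minimal sketch of that setup done programmatically instead of through core-site.xml, shown here in PySpark (the question uses Scala, but the Hadoop FileSystem calls are identical through the JVM gateway); the account name, container, and key are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("list-azure-blobs").getOrCreate()

# HDFS-compatible access via the wasbs:// connector; this mirrors the
# core-site.xml setting fs.azure.account.key.<account>.blob.core.windows.net.
conf = spark._jsc.hadoopConfiguration()
conf.set("fs.azure.account.key.myaccount.blob.core.windows.net", "<storage-key>")

Path = spark._jvm.org.apache.hadoop.fs.Path
root = Path("wasbs://mycontainer@myaccount.blob.core.windows.net/")
fs = root.getFileSystem(conf)

# List every file under the container root.
for status in fs.listStatus(root):
    print(status.getPath().toString())
```

This still requires the hadoop-azure and azure-storage jars on the classpath, exactly as the answer describes.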

Download file from Azure blob container using SAS URI in Python

Posted by 时间秒杀一切 on 2019-12-11 15:59:05
Question: I have an Azure container where I keep some files, and I need to access them using Python code. I did the same thing in Java but am unable to replicate it in Python. This is the Java code for it:

```java
// Java version, constructed from a SAS URI
CloudBlobContainer Con = new CloudBlobContainer("Some SAS URI");
CloudBlockBlob blob1 = Con.getBlockBlobReference(fileName);
blob1.downloadToFile(filePath + fileName + userName);
```

Answer 1: There is no equivalent method in Python; you can take a look at the Container class of the Python SDK. You should always use BlockBlobService…
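A minimal Python sketch of the same download using the legacy azure-storage SDK's BlockBlobService, which the answer points at; account, container, blob, and path names are placeholders:

```python
from azure.storage.blob import BlockBlobService  # legacy azure-storage SDK

# sas_token is the query string of the SAS URI, without the leading '?'.
service = BlockBlobService(account_name="myaccount",
                           sas_token="sv=2018-03-28&ss=b&sig=...")

# Equivalent of the Java downloadToFile call above.
service.get_blob_to_path("mycontainer", "report.csv", "/tmp/report.csv")
```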

Databricks read Azure blob last modified date

Posted by 假装没事ソ on 2019-12-11 09:55:14
Question: I have Azure blob storage mounted to my Databricks HDFS. Is there a way to get the last modified date of a blob in Databricks? This is how I'm reading the blob content:

```scala
val df = spark.read
  .option("header", "false")
  .option("inferSchema", "false")
  .option("delimiter", ",")
  .csv("/mnt/test/*")
```

Answer 1: Generally, there are two ways to read an Azure blob's last modified date, as below: directly read it via the Azure Storage REST API, or via the Azure Storage SDK for Java. After I researched Azure Blob…
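As a sketch of the SDK route (the answer names the Java SDK; the legacy Python SDK exposes the same property, and all names below are placeholders):

```python
from azure.storage.blob import BlockBlobService  # legacy azure-storage SDK

svc = BlockBlobService(account_name="myaccount", account_key="<storage-key>")

# get_blob_properties returns a Blob whose .properties carries Last-Modified.
blob = svc.get_blob_properties("test", "data/part-00000.csv")
print(blob.properties.last_modified)  # timezone-aware datetime
```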

Nodejs upload base64 image to azure blob storage using .createBlockBlobFromLocalFile()

Posted by 南楼画角 on 2019-12-11 07:57:32
Question: I want to upload the profile picture of a user, sent from a web app and a mobile app in Base64 form. On the POST request they need to send a JSON body that looks something like this:

```json
{
  "name": "profile-pic-123.jpg",
  "file": "data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAAQABAAD/2wCEAAkGBxQTEhIUEhIUFBUV…K9rk8hCAEkjFMUYiEAI+nHIpsQh0AkisDYRTOiCAbWVtgCtI6IlkHh7LDTQXLH0EIQBj//2Q=="
}
```

(The "file" value is the base64 image.) Now on the server side, using Node and Express, I used the npm module called azure-storage, which…
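The question is about Node's azure-storage module, but the core steps, strip the data-URI prefix, decode the base64 payload, and upload the raw bytes, are the same in any SDK; a minimal sketch in Python with the legacy SDK, where credentials and the container name are placeholders:

```python
import base64

from azure.storage.blob import BlockBlobService, ContentSettings

payload = {
    "name": "profile-pic-123.jpg",
    "file": "data:image/jpeg;base64,/9j/4AAQ...",  # truncated placeholder
}

# Strip the "data:image/jpeg;base64," prefix, then decode the payload.
header, b64data = payload["file"].split(",", 1)
image_bytes = base64.b64decode(b64data)

svc = BlockBlobService(account_name="myaccount", account_key="<storage-key>")
svc.create_blob_from_bytes(
    "profile-pics", payload["name"], image_bytes,
    content_settings=ContentSettings(content_type="image/jpeg"))
```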

Get all files from Azure blob container using REST API: I have already tried this code but am facing a 403 error

Posted by 旧时模样 on 2019-12-11 07:29:49
Question: I already get containers using the REST API; now I want to get all files from a blob storage container using the REST API. This is the endpoint I am using:

```csharp
private const string ListofFilesURL =
    "https://{0}.blob.core.windows.net/{1}?restype=container&comp=list&maxresults=10";
```

My code is below:

```csharp
public async void ListofFilessinBlob(string containername)
{
    string Requesturl = string.Format(ListofFilesURL, storageAccount, containername);
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(Requesturl);
    string now = …
```
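A 403 on this operation usually means the hand-built Shared Key Authorization/Date headers are wrong. One way to sidestep the signing entirely is to append a container SAS token to the list URL; a Python sketch where the account, container, and SAS values are placeholders:

```python
import xml.etree.ElementTree as ET

import requests

account = "myaccount"
container = "mycontainer"
sas_token = "sv=2018-03-28&sr=c&sig=..."  # container-level SAS, no leading '?'

url = (f"https://{account}.blob.core.windows.net/{container}"
       f"?restype=container&comp=list&maxresults=10&{sas_token}")

resp = requests.get(url)
resp.raise_for_status()  # a 403 here means the SAS is invalid or expired

# The List Blobs response is XML: EnumerationResults/Blobs/Blob/Name.
root = ET.fromstring(resp.content)
for blob in root.iter("Blob"):
    print(blob.find("Name").text)
```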

Azure functions python no value for named parameter

Posted by 老子叫甜甜 on 2019-12-11 05:09:10
Question: I am currently using Python in Azure Functions to create a timer trigger that aggregates data from blob storage and puts the result in Cosmos DB. My problem is as follows: when I use a specific file in the path binding, the function runs as expected. Whenever I change it (so as to take all blobs in the container) I get the following error: Microsoft.Azure.WebJobs.Host: No value for named parameter 'test'. Below are my function.json bindings:

```json
{ "bindings": [ { "name": "blobTrigger", "type": …
```
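That error is what the Functions runtime reports when a binding path contains a {token} (here {test}) that nothing supplies a value for; a timer trigger produces no trigger metadata to fill it. A common workaround, sketched below with the legacy Python SDK (credentials and container name are placeholders), is to drop the wildcard blob binding and enumerate the container inside the function instead:

```python
import azure.functions as func
from azure.storage.blob import BlockBlobService  # legacy azure-storage SDK


def main(mytimer: func.TimerRequest) -> None:
    # Enumerate all blobs ourselves instead of binding "container/{test}".
    svc = BlockBlobService(account_name="myaccount", account_key="<storage-key>")
    for blob in svc.list_blobs("input-container"):
        text = svc.get_blob_to_text("input-container", blob.name).content
        # ... aggregate `text` and write the result to Cosmos DB ...
```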

MarkLogic Failover Cluster on Azure - Forest configuration on Azure Blob

Posted by 馋奶兔 on 2019-12-11 04:46:41
Question: As per the MarkLogic cluster recommendation, we need to configure it as described in the link below: MarkLogic Cluster - Configure Forest with all documents. Forest configuration was done as per the MarkLogic on Azure guide, page 28, i.e. the Azure storage key has been set under Security -> Credentials -> Azure, and the data directory has been set as azure:// This is working fine, and every forest on each cluster host has been placed in a different container within the same Azure Blob storage account. Now I want to configure failover for the cluster by replicating…

How to copy data to VM from blob storage?

Posted by 安稳与你 on 2019-12-11 04:19:25
Question: Is it possible to copy file(s) that are present in Azure blob storage to an Azure virtual machine? After exploring the Azure Data Factory documentation, it seems that the Data Management Gateway provides 'File System' as a sink for data, but I am not able to find any documentation/tutorial for it. Can anyone tell me whether it is possible, and if yes, how it can be done? Adding more detail about the original task: we have one Windows application which is supposed to access blob storage and fetch the input…
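Since the application on the VM already has blob access, one straightforward alternative to a Data Factory pipeline is to pull the files down from inside the VM with the storage SDK (or a tool such as AzCopy); a Python sketch with placeholder credentials, container name, and target directory:

```python
import os

from azure.storage.blob import BlockBlobService  # legacy azure-storage SDK

svc = BlockBlobService(account_name="myaccount", account_key="<storage-key>")
target_dir = r"C:\data\input"

# Mirror the container into a local folder, preserving virtual directories.
for blob in svc.list_blobs("input"):
    local_path = os.path.join(target_dir, blob.name.replace("/", os.sep))
    os.makedirs(os.path.dirname(local_path), exist_ok=True)
    svc.get_blob_to_path("input", blob.name, local_path)
```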

Using Azure Blob Storage with java MVC Azure Web Site

Posted by 非 Y 不嫁゛ on 2019-12-11 02:04:11
Question: I have created a Java MVC web app and deployed it on the Azure cloud. Now I am trying to capture my web application's logs into a text/CSV file and store that file in Azure Blob Storage. Can anyone tell me how to do this, i.e. how to access Azure Blob Storage? I went through an article on this but it was not of much help. Note: in the on-premises application we do the same using a properties file and the log4j jar; I want to do the same in an Azure web app.

Answer 1: Based on my understanding, I…
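The question is about Java and log4j, but the underlying pattern, appending log lines to an append blob, is the same in any SDK; a minimal Python sketch with the legacy SDK, where the account, container, and blob names are placeholders:

```python
from azure.storage.blob import AppendBlobService  # legacy azure-storage SDK

svc = AppendBlobService(account_name="myaccount", account_key="<storage-key>")

# Create the append blob once; later calls only append to it.
if not svc.exists("logs", "webapp.log"):
    svc.create_blob("logs", "webapp.log")

svc.append_blob_from_text("logs", "webapp.log",
                          "2019-12-11 02:04:11 INFO request handled\n")
```

In the Java/log4j setup the equivalent would be a custom appender that performs the same append call, with the connection details kept in the properties file as on premises.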