azure-functions

Azure Function Read local.settings.json to object

懵懂的女人 · Submitted on 2021-01-03 06:16:39

Question: I know I could add all my environment variables under the "Values" section of local.settings.json. However, I am trying to keep a tidy home and would like to do something like this.

local.settings.json:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureWebJobsDashboard": "",
    "Hello": "world"
  },
  "ClientConfiguration": {
    "this": "that",
    "SubscriberEndpoint": "",
    "Username": "",
    "Password": "",
    "ObjectEndpoint": ""
  }
}

In my code I have var config =
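The Functions host only loads the "Values" section into the environment; a custom section like "ClientConfiguration" has to be read and bound by your own code (in C#, typically via a ConfigurationBuilder pointed at the file). As a sketch of the same idea in Python, the language of this page's other examples, with a hypothetical ClientConfiguration dataclass mirroring the section above:

```python
import json
from dataclasses import dataclass

# Hypothetical typed view of the custom "ClientConfiguration" section.
@dataclass
class ClientConfiguration:
    SubscriberEndpoint: str
    Username: str
    Password: str
    ObjectEndpoint: str

def load_client_configuration(path: str) -> ClientConfiguration:
    """Read local.settings.json and bind the custom section to an object."""
    with open(path) as f:
        settings = json.load(f)
    section = settings["ClientConfiguration"]
    # Keep only the keys the dataclass declares (extras like "this" are dropped).
    known = ClientConfiguration.__dataclass_fields__
    return ClientConfiguration(**{k: v for k, v in section.items() if k in known})
```

The binding step is the same in either language: parse the file once, pick out the custom section, and map it onto a typed object rather than scattering flat keys through "Values".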

How to properly handle secrets in a local.settings.json file when adding the function source code to a source control repository

夙愿已清 · Submitted on 2021-01-03 03:31:14

Question: I have an Azure Function with a few secrets in its local.settings.json file. What are the best practices when I want to share the source code of my function on GitHub? So far I can think of the following options, but each option has some issues or challenges:

1. Remember to change the secrets in local.settings.json any time I commit my changes. Once the commit is done, undo the changes so I can run and debug the function. This option is very error-prone and tedious.
2. Add local.settings.json
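A common pattern, and roughly what the Functions tooling encourages, is to keep local.settings.json out of source control (the default .gitignore already excludes it) and commit a secrets-free template instead; at runtime, code reads settings from environment variables, which the Functions host populates from "Values" locally and from App Settings in Azure. A minimal sketch of that lookup order, with the template file name being an assumption:

```python
import json
import os
from typing import Optional

def get_setting(name: str, settings_path: str = "local.settings.json") -> Optional[str]:
    """Resolve a setting: environment variables first, then the gitignored local file.

    local.settings.json stays listed in .gitignore; a sanitized copy
    (e.g. local.settings.template.json, a hypothetical name) is what gets committed.
    """
    if name in os.environ:
        return os.environ[name]
    if os.path.exists(settings_path):
        with open(settings_path) as f:
            return json.load(f).get("Values", {}).get(name)
    return None
```

Because code only ever asks for a setting by name, the same function runs unchanged locally (file-backed) and in Azure (environment-backed), and no secret ever needs to enter the repository.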

Azure Function (Python) w/ Storage Upload Trigger Fails with Large File Uploads

人走茶凉 · Submitted on 2021-01-02 00:37:14

Question: An Azure Function (Python) is triggered by file uploads to Azure Storage. The function works fine for files up to ~120 MB. I just load-tested with a 2 GB file and the function produced the error "Stream was too long". Where is this limitation documented? How would I overcome it using Python? I am using the boto3 library to PUT files to AWS S3:

def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
    myblobBytes
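The "Stream was too long" error typically means the entire blob was read into a single in-memory buffer (a .NET stream buffer is capped at around 2 GB), so the usual fix is to read and forward the stream in chunks instead of calling read() on the whole thing. With boto3, `upload_fileobj` accepts a file-like object and performs a multipart upload in chunks for you. A minimal sketch of the chunked-read pattern, using an in-memory stream as a stand-in for `func.InputStream`:

```python
import io

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB per read

def copy_in_chunks(src, dst, chunk_size: int = CHUNK_SIZE) -> int:
    """Copy src to dst without ever holding the whole payload in memory."""
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)
        total += len(chunk)
    return total

# With boto3, the same idea is handled for you (bucket name hypothetical):
#   s3.upload_fileobj(myblob, "my-bucket", myblob.name)
# which streams a multipart upload instead of the whole-file read
#   myblobBytes = myblob.read()
# that overflows on a 2 GB blob.
```

The key design point is that memory use stays bounded by the chunk size rather than growing with the blob, so the same code handles a 2 GB upload and a 2 KB one.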

Azure functions - should functions be written inside static classes

懵懂的女人 · Submitted on 2020-12-30 05:35:55

Question: I'm starting to try out Azure Functions. I'm using Visual Studio 2017 Preview, version 15.3. When I right-click the Azure Functions project I created and select Add > New Item... > Azure Function, the default template Visual Studio generates is a public static class with a public static async Task method (the function). Does the class need to be static (I changed it to non-static and it seems to work)? Is that a best practice for Azure Functions? If that is the case, what problems might

Azure functions: Unable to load DLL 'sni.dll' or one of its dependencies: The specified module could not be found. (0x8007007E)

扶醉桌前 · Submitted on 2020-12-27 06:58:17

Question: Trying to run this Azure Function in the Azure portal, but it fails with the error in the title:

using System;
using System.IO;
using System.Net;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;
using System.Data.SqlClient;

public static string Run(HttpRequest req, ILogger log)
{
    string name = "dbconn";
    string conStr = System