blobs

Preventing an Azure blob from being accessed by another service while it's being created

时光总嘲笑我的痴心妄想 submitted on 2020-01-05 10:33:28
Question: The Azure Blob API sometimes looks like it was designed by aliens, built for very exotic use cases while the simplest ones require jumping through hoops. Here is one such case. I have two worker roles. One creates blobs; the other processes them (and moves them to a "completed" folder when processing is done). The blob size can be moderately big, around 100 MB. Obviously, I don't want the second role to start reading a blob before the blob has all the data. Okay, one can expect the…
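
One common workaround (a sketch, not the poster's actual solution) is to upload the blob under a temporary name that the consumer ignores, and only copy it to the watched location once the upload has finished, so the processing role never sees a half-written blob. The sketch below uses the Python SDK (azure-storage-blob) rather than the .NET worker-role code in the question; the connection string, container, and blob names are all placeholders.

    import time
    from azure.storage.blob import BlobServiceClient

    # Sketch: upload under a "staging/" prefix, then server-side copy to the
    # "incoming/" prefix the consumer polls, and finally delete the staging copy.
    service = BlobServiceClient.from_connection_string("<connection-string>")
    container = service.get_container_client("work-items")

    staging = container.get_blob_client("staging/item-001.dat")
    with open("item-001.dat", "rb") as data:
        staging.upload_blob(data, overwrite=True)     # may take a while for ~100 MB

    final = container.get_blob_client("incoming/item-001.dat")
    final.start_copy_from_url(staging.url)            # copy within the same account

    # wait until the copy has completed before removing the staging blob
    while final.get_blob_properties().copy.status == "pending":
        time.sleep(1)
    staging.delete_blob()

The same pattern works with the .NET SDK; the key point is that the consumer only ever lists the prefix that receives fully written blobs.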

Python OpenCV - blob detection or circle detection

社会主义新天地 submitted on 2019-12-29 19:00:45
Question: I am having problems detecting circular areas. I tried the HoughCircles function from OpenCV, but even though the images are pretty similar, the parameters for the function have to be different in order to detect the circles. Another approach I tried was to iterate over every pixel and check whether the current pixel is white. If it is, check whether there is a blob object in the area (distance to the blob center smaller than a threshold). If there is, append the pixel to the blob; if…
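
Where the HoughCircles parameters prove too image-specific, one alternative worth sketching is OpenCV's SimpleBlobDetector with a circularity filter; it groups connected white pixels into blobs much like the manual approach described above. A minimal sketch, assuming an 8-bit grayscale input; the file name and all thresholds are placeholders to tune.

    import cv2

    # Detect roughly circular bright regions with SimpleBlobDetector.
    img = cv2.imread("circles.png", cv2.IMREAD_GRAYSCALE)

    params = cv2.SimpleBlobDetector_Params()
    params.filterByColor = True
    params.blobColor = 255            # look for white blobs (default is dark blobs)
    params.filterByArea = True
    params.minArea = 50               # ignore small specks; tune per image
    params.filterByCircularity = True
    params.minCircularity = 0.7       # keep only roughly circular shapes

    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(img)

    out = cv2.drawKeypoints(img, keypoints, None, (0, 0, 255),
                            cv2.DRAW_MATCHES_FLAGS_DRAW_RICH_KEYPOINTS)
    cv2.imwrite("circles_detected.png", out)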

Blob data from Oracle to a text file using Python

天涯浪子 submitted on 2019-12-24 08:35:27
Question: I have been trying to get the BLOB data from Oracle into a text file using Python. I couldn't find the answer in any of the other links. Below is my code:

    sql_string = """select event_id ,blob_length ,blob field from table"""
    cur.execute(sql_string)
    path = "P:/Folders/"
    for row in cur:
        filename = path + "notes_" + str(row[0]) + "_" + str(row[1]) + ".txt"
        f = codecs.open(filename, encoding='utf-8', mode='wb+')
        f.write(row[2])
        f.close()

I get the error below: TypeError: utf_8_encode() argument…
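
The TypeError usually means the UTF-8 writer is being handed a cx_Oracle LOB locator (or raw bytes) instead of text. A minimal sketch of the usual fix, assuming the driver is cx_Oracle and the third column is a BLOB: read the LOB explicitly and write the bytes in binary mode, without codecs. The connection string, table, and column names are placeholders.

    import cx_Oracle  # assumed driver; all identifiers are placeholders

    conn = cx_Oracle.connect("user/password@host/service")
    cur = conn.cursor()
    cur.execute("select event_id, blob_length, blob_field from some_table")

    path = "P:/Folders/"
    for event_id, blob_length, blob_col in cur:
        filename = "{}notes_{}_{}.txt".format(path, event_id, blob_length)
        data = blob_col.read()           # LOB locator -> bytes
        with open(filename, "wb") as f:  # binary mode: no UTF-8 encoding of bytes
            f.write(data)

    cur.close()
    conn.close()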

Persisting Blob Streams with NHibernate

假装没事ソ submitted on 2019-12-20 14:43:00
Question: If I have a class declared as:

    public class MyPersistentClass
    {
        public int ID { get; set; }
        public Stream MyData { get; set; }
    }

How can I use NHibernate's mappings to persist the MyData property to and from the database?

Answer 1: You could use a Stream with a custom type and map it according to your storage needs. But there are some issues with using the Stream object, as I mention in my blog series about lazy streaming of BLOBs and CLOBs with NHibernate. What you really need is a Blob object that…

How to replace the deprecated BlobBuilder with the new Blob constructor?

家住魔仙堡 submitted on 2019-12-18 05:02:44
Question: Since BlobBuilder is deprecated, and I have recently decided to use a new facial recognition API, I am having a hard time switching over to just "Blob".

    function dataURItoBlob(dataURI, callback) {
        // convert base64 to raw binary data held in a string
        // doesn't handle URLEncoded DataURIs
        var byteString;
        if (dataURI.split(',')[0].indexOf('base64') >= 0) {
            byteString = atob(dataURI.split(',')[1]);
        } else {
            byteString = unescape(dataURI.split(',')[1]);
        }
        // separate out the mime component
        var…

Is it possible to use the ViBe algorithm, implemented in OpenCV, on a system without a GPU?

拜拜、爱过 submitted on 2019-12-18 03:48:12
Question: I want to test the ViBe algorithm for background subtraction. Currently I am using the OpenCV libraries. I found a sample implementation in the opencv/samples/gpu/bgfg_segm.cpp and bgfg_vibe.cpp files. These files are under the gpu module. Now I have a system without a GPU, and when I try to run the code, it crashes on the initialization of the first frame. Can anybody tell me how to solve this issue? Thanks in advance.

Answer 1: Pseudo code sucks big time! Here's the un-pseudo'ed version. Results? There is…
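
For reference, a CPU-only alternative (not ViBe itself, which the stock OpenCV CPU Python API does not expose): the built-in MOG2 background subtractor serves the same background-subtraction purpose without a GPU. A minimal sketch; the video path and parameters are placeholders.

    import cv2

    cap = cv2.VideoCapture("input.avi")
    subtractor = cv2.createBackgroundSubtractorMOG2(history=500,
                                                    varThreshold=16,
                                                    detectShadows=True)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        fg_mask = subtractor.apply(frame)   # 255 = foreground, 127 = shadow, 0 = background
        cv2.imshow("foreground mask", fg_mask)
        if cv2.waitKey(30) & 0xFF == 27:    # Esc quits
            break

    cap.release()
    cv2.destroyAllWindows()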

How to get hold of all the blobs in a blob container which has subdirectory levels (n levels)?

匆匆过客 submitted on 2019-12-17 16:46:20
Question: I tried using the ListBlobsSegmentedAsync method, but this returns only the blobs from the main parent directory level. I need the entire list of blobs from all n levels of subdirectories in one go.

    BlobContinuationToken continuationToken = null;
    bool useFlatBlobListing = true;
    BlobListingDetails blobListingDetails = BlobListingDetails.None;
    int maxBlobsPerRequest = 500;
    var blobOptions = new BlobRequestOptions(true);
    do
    {
        var listingResult = await cbDir.ListBlobsSegmentedAsync…
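
For comparison, in the Python SDK (azure-storage-blob, a different client from the question's .NET one) the listing is flat by default, so blobs from every virtual-folder level come back in a single paged iterator. A minimal sketch; the connection string and container name are placeholders.

    from azure.storage.blob import ContainerClient

    container = ContainerClient.from_connection_string(
        conn_str="<connection-string>", container_name="mycontainer")

    for blob in container.list_blobs():   # paging is handled by the iterator
        print(blob.name)                  # e.g. "level1/level2/file.txt"

In the older .NET client shown above, the useFlatBlobListing flag controls the same behaviour: a flat listing recurses into every virtual subdirectory.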

Oracle BLOB Extraction Very Slow

泪湿孤枕 submitted on 2019-12-13 14:02:00
Question: I am having performance issues when extracting BLOBs from an Oracle 10gR2 (10.2.0.5) database I administer. I have around 400 files stored as BLOBs that I need to write out to the file system. Below is my code. When I execute this procedure, the first 8 or so files are written within a couple of seconds, and from there things slow down exponentially, to somewhere around 1 file every 40 seconds after the first 8. To me this doesn't make any sense: why would the first 8 files be fast, but after that…
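
The original code is not shown in this excerpt, so purely as a hedged illustration: a chunked read pattern with cx_Oracle in Python that avoids holding each whole BLOB in memory and keeps the per-file cost roughly constant. All connection details, table, and column names are placeholders.

    import cx_Oracle  # assumed driver; identifiers below are placeholders

    conn = cx_Oracle.connect("user/password@host/service")
    cur = conn.cursor()
    cur.execute("select file_name, file_blob from blob_files")

    for file_name, lob in cur:
        chunk_size = lob.getchunksize() * 8   # read in multiples of the LOB chunk size
        offset = 1                            # LOB offsets are 1-based
        with open(file_name, "wb") as f:
            while True:
                data = lob.read(offset, chunk_size)
                if data:
                    f.write(data)
                if len(data) < chunk_size:
                    break
                offset += len(data)

    cur.close()
    conn.close()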