entropy

Calculating Entropy

强颜欢笑 submitted on 2019-12-04 08:27:54
Question: I've tried for several hours to calculate the entropy and I know I'm missing something. Hopefully someone here can give me an idea! EDIT: I think my formula is wrong! CODE:

    info <- function(CLASS.FREQ){
      freq.class <- CLASS.FREQ
      info <- 0
      for(i in 1:length(freq.class)){
        if(freq.class[[i]] != 0){ # zero check in class
          entropy <- -sum(freq.class[[i]] * log2(freq.class[[i]])) # I calculate the entropy for each class i here
        }else{
          entropy <- 0
        }
        info <- info + entropy # sum up entropy from all
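The usual pitfall with this formula (and likely the bug here) is feeding raw class counts into log2 instead of probabilities that sum to 1. A minimal sketch in Python of the normalization step, not the asker's exact R code:

```python
import math

def shannon_entropy(counts):
    """Shannon entropy in bits, computed from raw class counts."""
    total = sum(counts)
    entropy = 0.0
    for c in counts:
        if c > 0:                      # zero check, as in the asker's loop
            p = c / total              # normalize the count to a probability
            entropy -= p * math.log2(p)
    return entropy

print(shannon_entropy([5, 5]))   # 1.0: two equally likely classes carry 1 bit
```

If the R function is passed counts rather than probabilities, inserting the same `p <- freq.class[[i]] / sum(...)` normalization should give the expected result.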

Weird output while finding entropy of frames of a video in opencv

我的梦境 submitted on 2019-12-04 07:39:46
    #include <cv.h>
    #include <highgui.h>
    #include <iostream>
    #include <cmath>
    #include <cstdlib>
    #include <fstream>

    using namespace std;

    typedef struct histBundle {
        double rCh[256];
        double gCh[256];
        double bCh[256];
    } bundleForHist;

    bundleForHist getHistFromImage (IplImage* img, int numBins) {
        float range[] = { 0, numBins };
        float *ranges[] = { range };
        bundleForHist bfh;
        CvHistogram *hist = cvCreateHist (1, &numBins, CV_HIST_ARRAY, ranges, 1);
        cvClearHist (hist);
        IplImage* imgRed = cvCreateImage(cvGetSize(img), 8, 1);
        IplImage* imgGreen = cvCreateImage(cvGetSize(img), 8, 1);
        IplImage* imgBlue =
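The legacy OpenCV C API above (cvCreateHist and friends) is long deprecated. The underlying calculation the asker is attempting, per-channel entropy from a normalized histogram, can be sketched with NumPy alone (a grayscale 8-bit channel is assumed here, not the asker's exact code):

```python
import numpy as np

def frame_entropy(channel, num_bins=256):
    """Entropy (in bits) of one 8-bit image channel via its normalized histogram."""
    hist, _ = np.histogram(channel, bins=num_bins, range=(0, num_bins))
    p = hist / hist.sum()          # normalize counts to probabilities
    p = p[p > 0]                   # drop empty bins to avoid log2(0)
    return float(-np.sum(p * np.log2(p)))

# Uniform noise should come out near the 8-bit maximum of 8 bits.
frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
print(frame_entropy(frame))
```

Weird outputs in this kind of code usually trace back to forgetting the normalization or taking log2 of empty bins; both are handled explicitly above.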

How (if at all) does a predictable random number generator get more secure after SHA-1ing its output?

折月煮酒 submitted on 2019-12-04 03:39:52
This article states that "Despite the fact that the Mersenne Twister is an extremely good pseudo-random number generator, it is not cryptographically secure by itself for a very simple reason. It is possible to determine all future states of the generator from the state the generator has at any given time, and either 624 32-bit outputs, or 19,937 one-bit outputs, are sufficient to provide that state. Using a cryptographically-secure hash function, such as SHA-1, on the output of the Mersenne Twister has been recommended as one way of obtaining a keystream useful in cryptography." But there are no
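The construction the quote describes can be sketched in a few lines. This is an illustration only, not an endorsement: Python's random module is itself MT19937, and hashing its output hides the internal state from an observer but adds no entropy beyond what was in the seed:

```python
import hashlib
import random

def hashed_keystream_block(rng):
    """One 20-byte keystream block: SHA-1 over raw Mersenne Twister output.
    Hashing hides the generator's 19,937-bit state from an observer of the
    keystream, but a weakly seeded generator stays weakly seeded."""
    raw = rng.getrandbits(512).to_bytes(64, "big")  # Python's random is MT19937
    return hashlib.sha1(raw).digest()

rng = random.Random(1234)        # same seed -> exactly the same keystream
block = hashed_keystream_block(rng)
print(len(block))  # 20 bytes per SHA-1 digest
```

The determinism is the point of the question: an attacker who recovers or guesses the seed reproduces the entire keystream, SHA-1 or not.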

PGP: Not enough random bytes available. Please do some other work to give the OS a chance to collect more entropy

别等时光非礼了梦想. submitted on 2019-12-03 18:38:04
Question: Setup: Ubuntu Server on a virtual machine with 6 cores and 3 GB of RAM. When I try to generate an asymmetric key pair via GPG like this: gpg --gen-key, I get the following error: "Not enough random bytes available. Please do some other work to give the OS a chance to collect more entropy!" I tried to Google a little. This is what I realised: I need to fire up another terminal and type in cat /dev/random, which prints a stream of randomly generated values, to increase the
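On Linux the kernel exposes its entropy-pool estimate, which is useful for diagnosing this error before and after installing an entropy daemon such as rng-tools or haveged (the usual fix, rather than cat-ing devices in another terminal). A small sketch; the /proc path exists only on Linux:

```python
def entropy_available(path="/proc/sys/kernel/random/entropy_avail"):
    """Return the kernel's entropy-pool estimate in bits, or None if unavailable."""
    try:
        with open(path) as f:
            return int(f.read().strip())
    except OSError:
        return None  # not on Linux, or /proc not mounted

bits = entropy_available()
print(bits)  # an integer on Linux, None elsewhere
```

GPG's key generation stalls when this value is low; headless VMs in particular collect little entropy because they have no keyboard, mouse, or disk-seek jitter to harvest.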

Weighted Decision Trees using Entropy

孤街醉人 submitted on 2019-12-03 13:01:41
I'm building a binary classification tree using mutual information gain as the splitting function. But since the training data is skewed toward a few classes, it is advisable to weight each training example by the inverse class frequency. How do I weight the training data? When calculating the probabilities to estimate the entropy, do I take weighted averages? EDIT: I'd like an expression for entropy with the weights. State-value weighted entropy as a measure of investment risk. http://www56.homepage.villanova.edu/david.nawrocki/State%20Weighted%20Entropy%20Nawrocki%20Harding.pdf Robert Harvey
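One reasonable reading of "weight each training example by the inverse class frequency" (a sketch of that interpretation, not the formula from the linked paper): scale each example's contribution by 1/class_count, then form the probabilities from the weighted totals before taking the entropy. With inverse-frequency weights every class receives equal total weight, so the skew is cancelled out:

```python
import math
from collections import Counter

def weighted_entropy(labels):
    """Entropy where each example is weighted by its inverse class frequency.
    Every class then contributes equal total weight, so the result is
    log2(number of distinct classes) regardless of skew."""
    counts = Counter(labels)
    weights = {c: 1.0 / n for c, n in counts.items()}    # inverse class frequency
    total = sum(weights[c] * n for c, n in counts.items())
    entropy = 0.0
    for c, n in counts.items():
        p = weights[c] * n / total       # weighted probability of class c
        entropy -= p * math.log2(p)
    return entropy

print(weighted_entropy(["a"] * 90 + ["b"] * 10))  # 1.0: the 90/10 skew vanishes
```

At each candidate split, the same weighted probabilities (restricted to the examples reaching each child) would replace the plain frequencies in the information-gain computation.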

sources of “uniqueness”/entropy on embedded systems

杀马特。学长 韩版系。学妹 submitted on 2019-12-03 09:06:23
Question: I have an embedded system. What I would like for it to do when it powers up or otherwise resets is to generate a unique ID, so that on different restarts a different unique ID is generated with high probability. It does not have access to a real-time clock, but it does have access to an ADC and a UART. I am wondering if there is a decent way to gather entropy from these sources to generate a unique ID. I am vaguely familiar with Yarrow. Is there a good way to use this? Unfortunately I do not
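A common pattern is to fold many noisy ADC readings through a cryptographic hash at boot, which is essentially the accumulation stage of Yarrow without the full generator machinery. A sketch under stated assumptions: read_adc_sample is a hypothetical stand-in for the board's ADC read, and only its noisy low-order bits are assumed to carry any real entropy:

```python
import hashlib

def unique_id_from_adc(read_adc_sample, num_samples=512):
    """Derive a boot-time ID by hashing many raw ADC readings together.
    read_adc_sample is a hypothetical callable returning one raw reading;
    the hash concentrates whatever entropy the noisy low bits contain."""
    h = hashlib.sha256()
    for _ in range(num_samples):
        sample = read_adc_sample()
        h.update((sample & 0xFF).to_bytes(1, "big"))  # keep the noisy low bits
    return h.hexdigest()

# Simulated ADC for illustration only; a real source must be measured for noise.
import random
sim = random.Random(42)
print(unique_id_from_adc(lambda: sim.randrange(4096)))  # 64 hex characters
```

The caveat matters: if the ADC input is quiet or strongly correlated between boots, the hash output will repeat too, so the per-sample entropy should be estimated conservatively before trusting the result.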

What is the computer science definition of entropy?

旧城冷巷雨未停 submitted on 2019-12-03 00:19:08
Question: I've recently started a course on data compression at my university. However, I find the use of the term "entropy" as it applies to computer science rather ambiguous. As far as I can tell, it roughly translates to the "randomness" of a system or structure. What is the proper definition of computer science "entropy"? Answer 1: Entropy can mean different things: Computing. In computing, entropy is the randomness collected by an operating system or application for use in cryptography or other uses
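Concretely, the data-compression sense is Shannon entropy, H(X) = -Σ p(x) · log2 p(x), measured in bits: the average number of yes/no questions needed to identify an outcome, and hence a lower bound on the average code length per symbol. A small worked example:

```python
import math

def H(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(H([0.5, 0.5]))    # 1.0 bit: a fair coin is maximally unpredictable
print(H([0.9, 0.1]))    # ~0.469 bits: a biased coin is easier to predict
print(H([1.0]))         # 0.0 bits: a certain outcome carries no information
```

The biased coin is why compression works: a source with entropy 0.469 bits/symbol can in principle be coded at under half a bit per symbol on average.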

Calculating entropy from GLCM of an image

岁酱吖の submitted on 2019-12-03 00:02:28
I am using the skimage library for most of my image analysis work. I have an RGB image and I intend to extract texture features like entropy, energy, homogeneity and contrast from the image. Below are the steps that I am performing:

    from skimage import io, color, feature
    from skimage.filters import rank
    from skimage.morphology import disk  # needed for disk(5) below
    import numpy as np                   # needed for np.pi below

    rgbImg = io.imread(imgFlNm)
    grayImg = color.rgb2gray(rgbImg)
    print(grayImg.shape)  # (667, 1000), a 2-dimensional grayscale image

    glcm = feature.greycomatrix(grayImg, [1], [0, np.pi/4, np.pi/2, 3*np.pi/4])
    print(glcm.shape)  # (256, 256, 1, 4)

    rank.entropy(glcm, disk(5))  # throws an error since entropy
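rank.entropy expects an 8-bit image plus a structuring element, not a GLCM, which is one reason the last line fails (greycomatrix also wants an integer image, while rgb2gray returns floats). GLCM entropy can instead be computed directly from the normalized co-occurrence matrix; a sketch, since skimage's greycoprops covers energy, homogeneity and contrast but (at least in older releases) not entropy:

```python
import numpy as np

def glcm_entropy(glcm):
    """Entropy of each GLCM slice: -sum(p * log2(p)) over nonzero cells.
    Expects the (levels, levels, num_distances, num_angles) layout
    returned by skimage's greycomatrix."""
    out = np.zeros(glcm.shape[2:])
    for d in range(glcm.shape[2]):
        for a in range(glcm.shape[3]):
            p = glcm[:, :, d, a].astype(float)
            p /= p.sum()                 # normalize counts to probabilities
            nz = p[p > 0]                # drop zeros to avoid log2(0)
            out[d, a] = -np.sum(nz * np.log2(nz))
    return out

# Tiny synthetic GLCM: two equally likely co-occurrence pairs -> 1 bit.
glcm = np.zeros((4, 4, 1, 1))
glcm[0, 0, 0, 0] = glcm[1, 1, 0, 0] = 8
print(glcm_entropy(glcm))  # [[1.]]
```

To feed a real image through this, first convert it with img_as_ubyte (or otherwise quantize to integers) before calling greycomatrix.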

How good is SecRandomCopyBytes?

纵饮孤独 submitted on 2019-12-02 23:47:15
I'm principally interested in the implementation of SecRandomCopyBytes on iOS, if it differs from the OS X implementation. (I would presume that it does, since a mobile device has more, and more readily available, sources of entropy than a desktop computer.) Does anyone have information on:

- Where does SecRandomCopyBytes get its entropy from?
- At what rate can it generate good random numbers?
- Will it block, or fail immediately, if not enough entropy is available?
- Is it FIPS 140-2 compliant, or has it been included in any other official certification?

The documentation does not cover these points. I've only
