probability

Poisson point process in MATLAB

Submitted by 那年仲夏 on 2019-12-04 22:17:47
I am new to Poisson point processes. I ran the following simulation in MATLAB, with intensity lambda = 50:

clear all;
lambda = 50;
npoints = poissrnd(lambda);
pproc = rand(npoints, 2);
plot(pproc(:, 1), pproc(:, 2), '.');

This gives me a plot. However, the post at http://connor-johnson.com/2014/02/25/spatial-point-processes/ shows a plot for an intensity of lambda = 0.2, which is smaller than 1 (the post also gives the code in Python; please check it). Here is my question: why can he still plot something when the intensity is smaller than 1? If I set lambda = 0.2 in my code, there is usually nothing to plot.
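A likely explanation, sketched below in Python with numpy (the window sizes and seeds are illustrative, not taken from the linked post): the intensity is the expected number of points per unit area, so the expected total count is lambda times the area of the window. On the unit square with lambda = 0.2 the Poisson draw is 0 most of the time, which is why nothing plots; the linked example effectively simulates over a much larger window, where lambda * area is well above 1.

```python
import numpy as np

def poisson_point_process(lam, width=1.0, height=1.0, rng=None):
    """Homogeneous Poisson point process on a width x height window.

    lam is the intensity, i.e. the expected number of points per unit
    area, so the expected total count is lam * width * height.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = rng.poisson(lam * width * height)  # random number of points
    xs = rng.uniform(0.0, width, n)        # given n, points are uniform
    ys = rng.uniform(0.0, height, n)
    return xs, ys

# With lam = 0.2 on the unit square the expected count is 0.2, so most
# draws are empty; on a 30 x 30 window the expected count is 180.
xs, ys = poisson_point_process(0.2, width=30, height=30)
```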

MATLAB code for a Gaussian mixture model with many components

Submitted by 泪湿孤枕 on 2019-12-04 15:17:49
I have applied the gaussmix function from the VOICEBOX MATLAB toolbox to estimate a GMM. However, the code gives me an error when I run it with 512 GMM components:

No_of_Clusters = 512;
No_of_Iterations = 10;
[m_ubm1, v_ubm1, w_ubm1] = gaussmix(feature, [], No_of_Iterations, No_of_Clusters);

Error using *
Inner matrix dimensions must agree.
Error in gaussmix (line 256)
pk = px*wt;            % pk(k,1) effective number of data points for each mixture (could be zero due to underflow)

I need 1024 or 2048 mixtures for Universal Background Model (UBM) construction. Could anyone give me MATLAB code to compute a GMM with a large number of components?
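This is not the VOICEBOX gaussmix itself, but a minimal numpy EM sketch for a diagonal-covariance GMM (the initialization, iteration count, and 1e-6 variance floor are all illustrative choices), showing the quantities such a routine computes. Doing the E-step in log-space with the max-subtraction trick is what keeps the effective counts from underflowing to zero at large component counts, which is the failure mode the gaussmix comment warns about.

```python
import numpy as np

def gmm_em(X, k, iters=10, rng=None):
    """Minimal EM for a diagonal-covariance GMM (illustrative sketch).

    X: (n, d) data matrix; k: number of mixture components.
    Returns means (k, d), variances (k, d), and weights (k,).
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    m = X[rng.choice(n, k, replace=False)]   # init means from data points
    v = np.var(X, axis=0) * np.ones((k, d))  # init with the global variance
    w = np.full(k, 1.0 / k)                  # uniform mixture weights
    for _ in range(iters):
        # E-step: log p(x_i, component j), shape (n, k), kept in log-space
        logp = (np.log(w)
                - 0.5 * np.sum(np.log(2 * np.pi * v), axis=1)
                - 0.5 * np.sum((X[:, None, :] - m) ** 2 / v, axis=2))
        logp -= logp.max(axis=1, keepdims=True)  # avoid exp() underflow
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)        # responsibilities
        # M-step
        nk = r.sum(axis=0) + 1e-10               # effective counts per mixture
        m = (r.T @ X) / nk[:, None]
        v = (r.T @ (X ** 2)) / nk[:, None] - m ** 2 + 1e-6
        w = nk / n
    return m, v, w
```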

Probability distribution function in Python

Submitted by こ雲淡風輕ζ on 2019-12-04 13:51:41
I know how to create a histogram in Python, but I would like it to be the probability density distribution. Let's start with my example. I have an array d with 500,000 elements. With the following code I build a simple histogram telling me how many elements of d fall into each bin:

max_val = log10(max(d))
min_val = log10(min(d))
logspace = np.logspace(min_val, max_val, 50)
H = hist(d, bins=logspace, histtype='step')

The problem is that this plot is not what I want: instead of raw counts, I would like the probability density function of my array d.
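A minimal numpy sketch of the usual fix, assuming d is a positive array (the lognormal draw below is only a stand-in for the real data): passing density=True to np.histogram (or to matplotlib's hist, where it replaced the older normed argument) rescales the counts so the area under the histogram is 1.

```python
import numpy as np

d = np.random.default_rng(0).lognormal(size=500)  # stand-in for the real d
logspace = np.logspace(np.log10(d.min()), np.log10(d.max()), 50)

# density=True divides each count by (total samples * bin width),
# turning the histogram into an estimate of the probability density.
pdf, edges = np.histogram(d, bins=logspace, density=True)
widths = np.diff(edges)
# (pdf * widths).sum() is 1: the heights now integrate to unity.
```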

Matplotlib: How to convert a histogram to a discrete probability mass function?

Submitted by 人盡茶涼 on 2019-12-04 13:10:31
I have a question about the hist() function in matplotlib. I am writing code to plot a histogram of data whose values vary from 0 to 1. For example:

values = [0.21, 0.51, 0.41, 0.21, 0.81, 0.99]
bins = np.arange(0, 1.1, 0.1)
a, b, c = plt.hist(values, bins=bins, normed=0)
plt.show()

The code above generates a correct histogram (I could not post an image since I do not have enough reputation). In terms of frequencies, it looks like:

[0 0 2 0 1 1 0 0 1 1]

I would like to convert this output to a discrete probability mass function, i.e. for the above example, each count divided by the total number of samples.
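A sketch of the conversion: dividing each bin count by the total number of samples turns the frequencies into a discrete probability mass function that sums to 1. (Note that in current matplotlib the normed keyword has been replaced by density, which normalizes to a density rather than a mass function, so dividing the raw counts yourself is the direct route to a PMF.)

```python
import numpy as np

values = [0.21, 0.51, 0.41, 0.21, 0.81, 0.99]
bins = np.arange(0, 1.1, 0.1)

counts, edges = np.histogram(values, bins=bins)  # same counts plt.hist shows
pmf = counts / counts.sum()  # e.g. the bin holding 2 of 6 samples gets 2/6
```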

How to generate random numbers with predefined probability distribution?

Submitted by 孤街醉人 on 2019-12-04 12:59:33
Question: I would like to implement a function in Python (using numpy) that takes a mathematical function (e.g. p(x) = e^(-x), as below) as input and generates random numbers distributed according to that function's probability distribution. I also need to plot them, so we can see the distribution. I actually need a random-number generator for exactly the following two mathematical functions as input, but if it could take other functions, why not: 1) p(x) = e^(-x)
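For p(x) = e^(-x) the standard approach is inverse-transform sampling, sketched below with numpy (the sample size and seed are arbitrary); for target densities whose CDF cannot be inverted in closed form, rejection sampling is the usual fallback.

```python
import numpy as np

def sample_exp(n, rng=None):
    """Inverse-transform sampling for p(x) = e^(-x), x >= 0.

    The CDF is F(x) = 1 - e^(-x), so F^{-1}(u) = -ln(1 - u): pushing
    uniform u through F^{-1} yields samples with density e^(-x).
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(0.0, 1.0, n)
    return -np.log(1.0 - u)

samples = sample_exp(100_000, np.random.default_rng(0))
# A density-normalized histogram of `samples` should follow e^(-x);
# the theoretical mean of this distribution is 1.
```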

Estimating a probability given other probabilities from a prior

Submitted by 纵然是瞬间 on 2019-12-04 12:33:25
Question: I have a bunch of data coming in (calls to an automated call center) about whether or not a person buys a particular product: 1 for buy, 0 for not buy. I want to use this data to estimate the probability that a person will buy a particular product, but the problem is that I may need to do it with relatively little historical data about how many people bought or didn't buy that product. A friend recommended that with Bayesian probability you can "help" your probability estimate by coming up with a prior probability.
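A sketch of the Beta-Bernoulli scheme the friend is presumably describing (the prior parameters a = 2, b = 8 are purely illustrative): a Beta(a, b) prior acts like a + b imaginary earlier calls, of which a were buys, and the posterior mean blends that prior belief with the observed data.

```python
def posterior_buy_probability(buys, total, prior_a=2.0, prior_b=8.0):
    """Posterior mean of the buy probability under a Beta(a, b) prior.

    With few observations the estimate stays near the prior mean
    a / (a + b); with many observations it approaches buys / total.
    """
    return (buys + prior_a) / (total + prior_a + prior_b)
```

For example, with 1 buy out of 3 calls the raw estimate 1/3 is shrunk toward the prior mean 0.2, giving 3/13; with 500 buys out of 1000 calls the prior barely matters.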

How can I compute the probability at a point given a normal distribution in Perl?

Submitted by 随声附和 on 2019-12-04 12:06:42
Is there a package in Perl that lets you compute the height of a probability distribution at a given point? For example, in R this can be done with:

> dnorm(0, mean=4, sd=10)
> 0.03682701

That is, the density of a normal distribution with mean=4 and sd=10 at the point x=0 is 0.0368. I looked at Statistics::Distribution but it doesn't provide that particular function. Why not something along these lines (I am writing it in R, but it could be done in Perl with Statistics::Distribution):

dn <- function(x=0       # value
              ,mean=0    # mean
              ,sd=1      # sd
              ,sc=10000  # scale the precision
              ) {
  res <-
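The normal density has a simple closed form, so it can be coded directly in a few lines even without a CPAN module; here is that formula as a sketch (written in Python rather than Perl, but it translates one-to-one):

```python
import math

def dnorm(x, mean=0.0, sd=1.0):
    """Normal probability density: the same quantity R's dnorm returns."""
    z = (x - mean) / sd
    return math.exp(-0.5 * z * z) / (sd * math.sqrt(2.0 * math.pi))

# dnorm(0, mean=4, sd=10) reproduces the R output 0.03682701 above.
```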

How to determine probability of words?

Submitted by 有些话、适合烂在心里 on 2019-12-04 11:35:57
I have two documents. Doc1 is in the format below:

TOPIC: 0 5892.0
site 0.0371690427699
Internet 0.0261371350984
online 0.0229124236253
web 0.0218940936864
say 0.0159538357094

TOPIC: 1 12366.0
web 0.150331554262
site 0.0517548115801
say 0.0451237263464
Internet 0.0153647096879
online 0.0135856380398

...and so on, in the same pattern, up to TOPIC: 99. And Doc2 is in the format:

0 0.566667 0 0.0333333 0 0 0 0.133333 ... and so on...

There are 100 values in total, one value for each topic. Now, I have to find the weighted average probability for each word, that is:

P(w) = alpha*P(w1) + alpha*P(w2) + ...
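A sketch of the mixture the question seems to be after, with made-up numbers: if theta[t] is topic t's weight (the t-th value from Doc2) and P(w | t) is the word's probability under topic t (from Doc1), the weighted average is P(w) = sum over t of theta[t] * P(w | t).

```python
# Hypothetical data for two topics; the real code would parse all 100
# topics out of Doc1 and the 100 weights out of Doc2.
p_wt = [
    {"site": 0.0371690427699, "web": 0.0218940936864},  # P(word | topic 0)
    {"site": 0.0517548115801, "web": 0.150331554262},   # P(word | topic 1)
]
theta = [0.566667, 0.433333]  # illustrative topic weights from Doc2

def word_probability(word):
    """Weighted average P(w) = sum_t theta[t] * P(w | t)."""
    return sum(t * p.get(word, 0.0) for t, p in zip(theta, p_wt))
```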

Expected collisions for a perfect 32-bit CRC

Submitted by 主宰稳场 on 2019-12-04 11:11:48
Question: I'm trying to determine how my CRC compares to an "ideal" 32-bit CRC. So I ran my CRC over 1 million completely random samples of data and counted the collisions; I want to compare this number to the number of collisions I could expect from an "ideal" 32-bit CRC. Does anyone know how to calculate the expected number of collisions for an "ideal" 32-bit CRC?

Answer 1: Compare your own CRC with 0x1EDC6F41 (the Castagnoli polynomial, CRC-32C) as your "ideal" reference. Having said that, there is no ideal 32-bit CRC; different polynomials ...
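Assuming the inputs are random and longer than 4 bytes, an ideal 32-bit CRC behaves like a uniform 32-bit hash, so the expected number of colliding pairs follows the birthday problem: each of the n(n-1)/2 pairs matches with probability 2^-32. A sketch:

```python
def expected_collisions(n, bits=32):
    """Expected number of colliding pairs among n uniform `bits`-bit hashes.

    Each of the n*(n-1)/2 pairs collides with probability 2**-bits,
    and expectations add, so no simulation is needed.
    """
    return n * (n - 1) / 2 / 2 ** bits

exp_pairs = expected_collisions(1_000_000)
```

For n = 1,000,000 this gives about 116.4 expected collisions, which is the figure to compare the measured count against.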

Underflow in Forward Algorithm for HMMs

Submitted by 蹲街弑〆低调 on 2019-12-04 08:51:21
I'm implementing the forward algorithm for HMMs to calculate the probability that a given HMM emits a given observation sequence. I'd like my algorithm to be robust to underflow. I can't work purely in log-space because the forward algorithm requires both multiplication AND addition of probabilities. What is the best way to avoid underflow? I've read some sources on this, but the best suggestion I've found is scaling the probabilities at each time step (Section 6 here). By the end of the algorithm you won't be left with the exact probability you want (of the observation sequence).
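The per-step scaling suggestion does in fact recover the exact answer: if alpha is renormalized to sum to 1 at every step and the logs of the scale factors are accumulated, their sum is exactly log P(observations), so nothing is lost. A minimal numpy sketch (variable shapes are stated in the docstring; the model itself is up to the caller):

```python
import numpy as np

def forward_log_likelihood(pi, A, B, obs):
    """Scaled forward algorithm: returns log P(obs) without underflow.

    pi: (S,) initial state probabilities; A: (S, S) transition matrix;
    B: (S, V) emission probabilities; obs: sequence of symbol indices.
    alpha is renormalized at each step; the accumulated log of the
    scaling factors is exactly the log-likelihood of the sequence.
    """
    alpha = pi * B[:, obs[0]]
    log_lik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # one forward recursion step
        c = alpha.sum()                 # scale factor = P(o | history)
        log_lik += np.log(c)
        alpha = alpha / c               # keep alpha well away from 0
    return log_lik
```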