expectation-maximization

GMM/EM for clustering time series

Submitted by 你。 on 2021-02-08 10:07:40
Question: According to a paper, this is supposed to work, but as a learner of the scikit-learn package I do not see how. All the sample code clusters by ellipses or circles, as here. I would really like to know how to cluster the following plot by different patterns... 0-3 are the mean of power over certain time periods (divided into 4), while 4, 5 and 6 correspond to the standard deviation over the year, the variance across weekday/weekend, and the variance across winter/summer, respectively. So the y-label does not necessarily match 4, 5, 6.
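A minimal clustering sketch, assuming each series is summarised by the seven numeric features (0-6) described above and collected into one feature matrix; the array name features, the three-component choice, and the random stand-in data are placeholders, not taken from the question:

    # Hypothetical sketch: cluster rows of per-series features with a Gaussian mixture.
    # `features` (n_series x 7) stands in for the asker's columns 0-6.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    features = rng.random((50, 7))          # stand-in for the real feature matrix

    X = StandardScaler().fit_transform(features)   # put all 7 features on one scale
    gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
    labels = gmm.fit_predict(X)             # one cluster label per series/row
    print(labels)

The predicted labels can then be used to colour the plotted curves so that series with similar patterns share a colour.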

Numpy __array_prepare__ error

Submitted by 主宰稳场 on 2020-01-05 09:16:21
Question: I'm trying to get a recipe I found online for doing expectation maximization working (http://code.activestate.com/recipes/577735-expectation-maximization/). I run into the following error:

    Traceback (most recent call last):
      File "./runem.py", line 7, in <module>
        print expectation_maximization([[1,2,3,4,5],[2,3,4,5,6],[9,8,7,4,1]], 2)
      File "/local/scratch-3/dk427/rp/em.py", line 83, in expectation_maximization
        Px[o,c] = pnorm(t[o,:], params[c]['mu'], params[c]['sigma'])
      File "/local
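The recipe's pnorm is presumably a Gaussian density (an assumption; the recipe's actual implementation is not reproduced here). A minimal sketch of computing the same quantity with plain ndarrays and scipy.stats, which avoids the numpy.matrix/ufunc code path where __array_prepare__ gets invoked:

    # Assumed stand-in for the recipe's pnorm(x, mu, sigma): a multivariate
    # Gaussian density evaluated with plain ndarrays (no numpy.matrix).
    import numpy as np
    from scipy.stats import multivariate_normal

    def pnorm(x, mu, sigma):
        x = np.asarray(x, dtype=float)       # force plain ndarray inputs
        mu = np.asarray(mu, dtype=float)
        sigma = np.asarray(sigma, dtype=float)
        return multivariate_normal.pdf(x, mean=mu, cov=sigma)

    print(pnorm([1.0, 2.0], [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]))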

What is an intuitive explanation of the Expectation Maximization technique? [closed]

Submitted by 北慕城南 on 2019-12-29 10:08:51
Question: Expectation Maximization (EM) is a kind of probabilistic method used to classify data; please correct me if I am wrong and it is not a classifier. What is an intuitive explanation of this EM technique? What is the expectation here, and what is being maximized? Answer 1: Note: the code behind this

Expectation Maximization coin toss examples

Submitted by 醉酒当歌 on 2019-12-13 11:41:41
Question: I've been self-studying Expectation Maximization lately and grabbed some simple examples in the process: http://cs.dartmouth.edu/~cs104/CS104_11.04.22.pdf There are 3 coins, 0, 1 and 2, with probabilities P0, P1 and P2 of landing on heads when tossed. Toss coin 0; if the result is heads, toss coin 1 three times, else toss coin 2 three times. The observed data produced by coins 1 and 2 looks like this: HHH, TTT, HHH, TTT, HHH. The hidden data is coin 0's result. Estimate P0, P1 and P2. http://ai
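A minimal sketch of the EM updates for this three-coin model, using the five observed triples above; the initial guesses for P0, P1 and P2 are arbitrary placeholders, and the update formulas are the standard responsibility-weighted counts rather than anything taken from the linked notes:

    # EM for the three-coin model: coin 0 (bias P0) picks which of coins 1/2
    # (biases P1, P2) is tossed three times; only the triples are observed.
    import numpy as np

    heads = np.array([3, 0, 3, 0, 3])   # heads in HHH, TTT, HHH, TTT, HHH
    n = 3                                # tosses per triple
    P0, P1, P2 = 0.6, 0.7, 0.4           # arbitrary initial guesses

    for _ in range(50):
        # E-step: responsibility that coin 1 produced each triple
        like1 = P0 * P1**heads * (1 - P1)**(n - heads)
        like2 = (1 - P0) * P2**heads * (1 - P2)**(n - heads)
        w = like1 / (like1 + like2)
        # M-step: re-estimate the three biases from weighted counts
        P0 = w.mean()
        P1 = (w * heads).sum() / (w * n).sum()
        P2 = ((1 - w) * heads).sum() / ((1 - w) * n).sum()

    print(P0, P1, P2)   # converges to roughly 0.6, 1.0, 0.0 for this data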

Expectation Maximization get covs function not working on OpenCV 2.4.6, and number of clusters changes after the train function

Submitted by 荒凉一梦 on 2019-12-11 13:13:47
Question: I have two questions. First: why does nclusters switch from 10 to 80 after the train function? Second: I am porting my code from C to C++ with OpenCV, but there seem to be some problems with it. I get an exception when I try to get the covs of my model. This is the code:

    int nclusters = 10;                              // Here nclusters is 10
    EM em_model(nclusters, EM::COV_MAT_GENERIC);
    bool isTrained = em_model.train(samples);        // Here nclusters is 80
    Mat means = em_model.get<Mat>("means");
    Mat weights = em_model
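The snippet above is cut off, so it cannot be repaired directly. Purely as an illustrative point of comparison, here is the same workflow (fix the component count up front, train, then read back means, weights and covariances) sketched with scikit-learn rather than the OpenCV 2.4 EM class; the sample data is made up:

    # Illustrative only: scikit-learn equivalent of "fix nclusters, train,
    # then read back means/weights/covs" -- not the OpenCV 2.4 API.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    samples = np.random.default_rng(1).normal(size=(200, 2))  # made-up data
    nclusters = 10

    gmm = GaussianMixture(n_components=nclusters, covariance_type="full")
    gmm.fit(samples)

    print(gmm.n_components)        # stays 10 after fitting
    print(gmm.means_.shape)        # (10, 2)
    print(gmm.weights_.shape)      # (10,)
    print(gmm.covariances_.shape)  # (10, 2, 2)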

Equidistant points across a cube

Submitted by ↘锁芯ラ on 2019-12-04 10:34:24
I need to initialize some three-dimensional points, and I want them to be equally spaced throughout a cube. Are there any creative ways to do this? I am using an iterative Expectation Maximization algorithm and I want my initial vectors to "span" the space evenly. For example, suppose I have eight points that I want to space equally in a cube sized 1x1x1. I would want the points at the corners of a cube with a side length of 0.333, centered within the larger cube. A 2D example is below. Notice that the red points are equidistant from each other and from the edges. I want the same in 3D. In cases
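One simple construction, sketched below with numpy: put n points per axis at positions i/(n+1) for i = 1..n, which makes the spacing between neighbouring points equal to the spacing to each face of the cube; for n = 2 this reproduces the eight corners at 1/3 and 2/3 described above. The helper name cube_grid is hypothetical:

    # Hypothetical helper: n^3 points on a regular grid inside the unit cube,
    # with point-to-point spacing equal to point-to-face spacing.
    import numpy as np

    def cube_grid(n):
        axis = np.arange(1, n + 1) / (n + 1)           # e.g. n=2 -> [1/3, 2/3]
        xs, ys, zs = np.meshgrid(axis, axis, axis, indexing="ij")
        return np.stack([xs.ravel(), ys.ravel(), zs.ravel()], axis=1)

    print(cube_grid(2))   # the 8 corners of a centred cube with side 1/3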

Numeric example of the Expectation Maximization Algorithm [duplicate]

Submitted by 孤人 on 2019-12-03 14:37:50
Question (marked as a duplicate of "What is an intuitive explanation of the Expectation Maximization technique?"): Could anyone provide a simple numeric example of the EM algorithm, as I am not sure about the formulas given? A really simple one with 4 or 5 Cartesian coordinates would do perfectly. Answer 1: What about this: http://en.wikibooks.org/wiki/Data_Mining_Algorithms_In_R/Clustering/Expectation_Maximization_(EM)#A_simple_example I had also written a

Numeric example of the Expectation Maximization Algorithm [duplicate]

Submitted by 喜你入骨 on 2019-12-03 03:27:08
This question already has answers here: What is an intuitive explanation of the Expectation Maximization technique? [closed] (8 answers) Could anyone provide a simple numeric example of the EM algorithm, as I am not sure about the formulas given? A really simple one with 4 or 5 Cartesian coordinates would do perfectly. What about this: http://en.wikibooks.org/wiki/Data_Mining_Algorithms_In_R/Clustering/Expectation_Maximization_(EM)#A_simple_example I had also written a simple example in R a year ago; unfortunately, I am unable to locate it. I'll try again to find it later. EDIT: Here it
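For a concrete numeric example of the kind asked for, here is a minimal sketch (not the answerer's lost R example): five 1D coordinates fitted with a two-component Gaussian mixture by hand, with made-up data and arbitrary initial guesses, printing the estimated means and standard deviations as EM iterates:

    # A tiny numeric EM example: five 1D coordinates, two Gaussian components.
    import numpy as np
    from scipy.stats import norm

    x = np.array([1.0, 2.0, 3.0, 8.0, 9.0])   # the "4 or 5 coordinates"
    mu = np.array([0.0, 5.0])                  # initial means
    sd = np.array([1.0, 1.0])                  # initial standard deviations
    w = np.array([0.5, 0.5])                   # initial mixing weights

    for step in range(10):
        # E-step: responsibility of each component for each point
        dens = w * norm.pdf(x[:, None], mu, sd)        # shape (5, 2)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and sds from the responsibilities
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
        print(step, mu.round(3), sd.round(3))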

What is an intuitive explanation of the Expectation Maximization technique? [closed]

Submitted by 蹲街弑〆低调 on 2019-11-29 18:34:10
Expectation Maximization (EM) is a kind of probabilistic method used to classify data; please correct me if I am wrong and it is not a classifier. What is an intuitive explanation of this EM technique? What is the expectation here, and what is being maximized? Note: the code behind this answer can be found here. Suppose we have some data sampled from two different groups, red and blue. Here, we can see which data point belongs to the red group and which to the blue group. This makes it easy to find the parameters that characterise each group. For example, the mean of the red group is around 3, the mean of the blue group
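This is not the code linked in the answer, but a minimal sketch of the fully labelled case the answer is describing: when every point's colour is known, each group's parameters are just its sample mean and standard deviation. The blue mean of 7, the spreads and the sample sizes are made-up values, not taken from the truncated answer:

    # The easy, fully labelled case: when we know which points are red and
    # which are blue, estimating each group's parameters is trivial.
    import numpy as np

    rng = np.random.default_rng(0)
    red = rng.normal(loc=3.0, scale=0.8, size=20)    # made-up red samples
    blue = rng.normal(loc=7.0, scale=1.0, size=20)   # made-up blue samples

    print("red:  mean %.2f, sd %.2f" % (red.mean(), red.std()))
    print("blue: mean %.2f, sd %.2f" % (blue.mean(), blue.std()))
    # EM is for the harder case where the colours are hidden and must be
    # inferred together with the means and standard deviations.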