multinomial


Fitted values for multinom in R: Coefficients for Reference Category?

Submitted by 眉间皱痕 on 2020-01-23 02:14:17
Question: I'm using the function multinom from the nnet package to run a multinomial logistic regression. In multinomial logistic regression, as I understand it, the coefficients are the changes in the log of the ratio of the probability of a response over the probability of the reference response (i.e., ln(P(i)/P(r)) = B1 + B2*X..., where i is one response category, r is the reference category, and X is some predictor). However, fitted(multinom(...)) produces estimates for each category, even the
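A minimal sketch of that relationship, using iris instead of the asker's data (so the variables are purely illustrative): multinom() only reports coefficient rows for the non-reference categories, yet fitted() returns a probability for every category, because the reference category's linear predictor is fixed at 0 and its probability falls out of the softmax normalisation.

```r
library(nnet)

fit <- multinom(Species ~ Sepal.Length, data = iris, trace = FALSE)
coef(fit)            # one row per non-reference category (reference: "setosa")

## fitted() gives a probability column for every category, including the reference.
head(fitted(fit))

## Recover the same probabilities by hand: the reference gets linear predictor 0,
## the others come from the coefficient rows, then apply the softmax.
X   <- cbind(1, iris$Sepal.Length)          # model matrix: intercept + predictor
eta <- cbind(0, X %*% t(coef(fit)))         # linear predictors, reference first
p   <- exp(eta) / rowSums(exp(eta))         # softmax -> matches fitted(fit)
head(p)
```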

How to interpret the output of choicemodelr (rhierMnlRwMixture) in R

Submitted by 送分小仙女□ on 2020-01-14 01:58:06
Question: My problem: I just started using the R library 'choicemodelr' and succeeded in getting some beta values as a solution. But I wonder how to assign these values to the specific attribute levels. As a result I only get values for A1B1, A1B2, A1B3, ... etc. How does this generic output generally connect to my design? I didn't find a hint in the documentation, neither for the choicemodelr library nor for the bayesm library (rhierMnlRwMixture) to which it is connected. I hope you can help me with
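Without the original design one cannot be specific, but the bookkeeping is usually mechanical: under effects (or dummy) coding, each attribute with L levels contributes L-1 beta columns, in the order the attributes appear in the design, and under effects coding the part-worth of the omitted level is minus the sum of the estimated ones. A base-R sketch of that mapping, with entirely made-up attribute names (no choicemodelr call involved):

```r
## Hypothetical design, not from the question: three attributes with 3, 4 and 2 levels.
n_levels <- c(Brand = 3, Price = 4, Size = 2)

## One label per estimated beta column, in design order, assuming effects coding
## (each attribute contributes levels - 1 columns).
beta_labels <- unlist(lapply(names(n_levels), function(a)
  paste0(a, "_level", seq_len(n_levels[[a]] - 1))))
beta_labels
## "Brand_level1" "Brand_level2" "Price_level1" "Price_level2" "Price_level3" "Size_level1"

## Under effects coding, the omitted (last) level's part-worth is the negative sum
## of the attribute's estimated betas.
omitted_level_partworth <- function(betas) -sum(betas)
omitted_level_partworth(c(0.4, -0.1))   # e.g. Brand level 3 = -0.3
```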

Multinomial distribution in PyMC

Submitted by 烂漫一生 on 2020-01-02 22:11:12
Question: I am a newbie to pymc. I have read the required material on GitHub and was doing fine until I got stuck on this problem. I want to make a collection of multinomial random variables which I can later sample using MCMC. The best I can do is rv = [ Multinomial("rv", count[i], p_d[i]) for i in xrange(0, len(count)) ], then for i in rv: print i.value; i.random(), and again for i in rv: print i.value. But this is no good, since I want to be able to call rv.value and rv.random(), otherwise I won't be able to sample

GBM multinomial distribution, how to use predict() to get predicted class?

Submitted by 独自空忆成欢 on 2019-12-30 08:27:47
Question: I am using the multinomial distribution from the gbm package in R. When I use the predict function, I get a series of values: 5.086328 -4.738346 -8.492738 -5.980720 -4.351102 -4.738044 -3.220387 -4.732654, but I want to get the probability of each class occurring. How do I recover the probabilities? Thank you. Answer 1: Take a look at ?predict.gbm; you'll see that there is a "type" parameter to the function. Try out predict(<gbm object>, <new data>, type="response"). Answer 2: predict.gbm(..., type=
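Following that advice, a minimal sketch on iris rather than the asker's data (newer gbm releases warn that the multinomial distribution is not fully supported, but the call still runs): with distribution = "multinomial", predict(..., type = "response") returns an array of class probabilities, and the predicted class is the column with the largest probability.

```r
library(gbm)

fit <- gbm(Species ~ ., data = iris, distribution = "multinomial",
           n.trees = 100, interaction.depth = 2, verbose = FALSE)

## type = "response" gives an [n, n.classes, length(n.trees)] array of probabilities.
probs <- predict(fit, newdata = iris, n.trees = 100, type = "response")
probs <- probs[, , 1]                         # drop the n.trees dimension
head(probs)

## Columns follow the factor-level order of the response, so the predicted class
## is the level whose column has the largest probability in each row.
pred_idx   <- apply(probs, 1, which.max)
pred_class <- levels(iris$Species)[pred_idx]
table(pred_class, iris$Species)
```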

Multinomial regression with imputed data

Submitted by 耗尽温柔 on 2019-12-24 06:21:32
Question: I need to impute missing data and then conduct a multinomial regression with the generated datasets. I have tried using mice for the imputation and then the multinom function from nnet for the multinomial regression, but this gives me unreadable output. Here is an example using the nhanes2 dataset available with the mice package: library(mice) library(nnet) test <- mice(nhanes2, meth=c('sample','pmm','logreg','norm')) # age is categorical, bmi is continuous m <- with(test, multinom(age ~ bmi, model = T)
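A sketch of one way to get readable results from that setup, assuming the goal is a single pooled coefficient table across imputations: fit the model inside with() as in the question, then combine the fits with pool() and summarise. This relies on pool() being able to extract coefficients and variances from nnet::multinom fits; the seed and printFlag arguments below are added only to keep the example quiet and reproducible.

```r
library(mice)
library(nnet)

imp  <- mice(nhanes2, method = c("sample", "pmm", "logreg", "norm"),
             printFlag = FALSE, seed = 1)           # age categorical, bmi continuous
fits <- with(imp, multinom(age ~ bmi, trace = FALSE))

## Rubin's rules across the m completed data sets: one pooled table of
## coefficients and standard errors instead of m separate multinom printouts.
summary(pool(fits))
```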

How to get average marginal effects (AMEs) with standard errors of a multinomial logit model?

Submitted by 核能气质少年 on 2019-12-23 22:43:04
Question: I want to get the average marginal effects (AMEs) of a multinomial logit model with standard errors. I've tried different methods for this, but so far none has reached the goal. Best attempt: My best attempt was to compute the AMEs by hand using mlogit, as shown below. library(mlogit) ml.d <- mlogit.data(df1, choice="Y", shape="wide") # shape data for `mlogit()` ml.fit <- mlogit(Y ~ 1 | D + x1 + x2, reflevel="1", data=ml.d) # fit the model # coefficient names c.names <- names(ml.fit$model)[-
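One alternative to the mlogit-by-hand route (a sketch, not the asker's approach): fit the same specification with nnet::multinom and let the marginaleffects package compute average marginal effects with delta-method standard errors. The simulated df1 below only stands in for the real data so the example runs on its own.

```r
library(nnet)
library(marginaleffects)

## Simulated stand-in for df1 (Y with 3 levels; predictors D, x1, x2).
set.seed(1)
df1 <- data.frame(D = rbinom(300, 1, 0.5), x1 = rnorm(300), x2 = rnorm(300))
df1$Y <- factor(sample(1:3, 300, replace = TRUE))

fit <- multinom(Y ~ D + x1 + x2, data = df1, trace = FALSE)

## One average marginal effect (with SE, z, p, CI) per predictor and per
## outcome level of Y, on the probability scale.
avg_slopes(fit)
```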

Multinomial Naive Bayes Classifier

Submitted by 怎甘沉沦 on 2019-12-22 06:46:59
Question: I have been looking for a multinomial naive Bayes classifier on CRAN, and so far all I can come up with is the binomial implementation in the e1071 package. Does anyone know of a package that has a multinomial naive Bayes classifier? Answer 1: Is bnlearn not doing it for you? http://www.bnlearn.com/ It is on CRAN, claims to implement "naive Bayes" network classifiers, and states that "discrete (multinomial) data sets are supported". Source: https://stackoverflow.com/questions/8874058/multinomial-naive-bayes-classifier
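Besides the bnlearn suggestion, a multinomial naive Bayes classifier is small enough to sketch directly in base R. The function and variable names below are made up for illustration, and add-one (Laplace) smoothing is assumed.

```r
## Train on a document-term count matrix X (rows = documents) and labels y.
train_mnb <- function(X, y, alpha = 1) {
  y <- factor(y)
  classes <- levels(y)
  log_prior <- log(as.numeric(table(y)) / length(y))
  ## Class-conditional word log-probabilities with add-alpha smoothing.
  log_lik <- t(sapply(classes, function(k) {
    counts <- colSums(X[y == k, , drop = FALSE]) + alpha
    log(counts / sum(counts))
  }))
  list(log_prior = log_prior, log_lik = log_lik, classes = classes)
}

## Predict the class with the highest posterior log-score for each row of X.
predict_mnb <- function(model, X) {
  scores <- X %*% t(model$log_lik) + rep(model$log_prior, each = nrow(X))
  model$classes[max.col(scores)]
}

## Tiny usage example: 4 documents, a 3-word vocabulary, two classes.
X <- matrix(c(3, 0, 1,
              2, 1, 0,
              0, 4, 1,
              1, 3, 0), nrow = 4, byrow = TRUE)
y <- c("spam", "spam", "ham", "ham")
model <- train_mnb(X, y)
predict_mnb(model, X)   # "spam" "spam" "ham" "ham" on the training documents
```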

Efficient Matlab implementation of Multinomial Coefficient

Submitted by 风流意气都作罢 on 2019-12-21 06:16:40
Question: I want to calculate the multinomial coefficient n!/(n0! n1! n2!), where n = n0 + n1 + n2. The Matlab implementation of this operator can easily be written as the function: function N = nchooseks(k1,k2,k3) N = factorial(k1+k2+k3)/(factorial(k1)*factorial(k2)*factorial(k3)); end However, when an index is larger than 170 its factorial is infinite, which generates NaN in some cases, e.g. 180!/(175! 3! 2!) -> Inf/Inf -> NaN. In other posts, this overflow issue has been solved for C and Python.
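The usual fix is to work on the log scale. Sketched here in R with lgamma() rather than Matlab; the identity log(n!) = lgamma(n + 1) carries over directly to Matlab's gammaln.

```r
## Compute n!/(n0! n1! ... nk!) on the log scale to avoid factorial overflow:
## exp(lgamma(n + 1) - sum(lgamma(k + 1))) with n = sum(k).
multinom_coef <- function(k) {
  exp(lgamma(sum(k) + 1) - sum(lgamma(k + 1)))
}

multinom_coef(c(175, 3, 2))   # 180!/(175! 3! 2!) ~ 1.49e10, finite instead of NaN
```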

In gbm multinomial dist, how to use predict to get categorical output? [duplicate]

Submitted by ☆樱花仙子☆ on 2019-12-20 12:31:09
Question: This question already has answers here: GBM multinomial distribution, how to use predict() to get predicted class? (2 answers). Closed 4 years ago. My response is a categorical variable (letters), so I used distribution='multinomial' when building the model, and now I want to predict the response and obtain the output in terms of these letters instead of a matrix of probabilities. However, predict(model, newdata, type='response') gives probabilities, the same as the result of type=

How does multinom() treat NA values by default?

Submitted by 旧巷老猫 on 2019-12-20 07:35:42
Question: When I run multinom(), say Y ~ X1 + X2 + X3, and for one particular row X1 is NA (i.e. missing) while Y, X2 and X3 all have values, is that entire row thrown out (as it would be in SAS)? How are missing values treated in multinom()? Answer 1: Here is a simple example (from ?multinom in the nnet package) to explore the different na.action settings: > library(nnet) > library(MASS) > example(birthwt) > (bwt.mu <- multinom(low ~ ., bwt)) Intentionally create an NA value: > bwt[1,"age"] <- NA #
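A minimal sketch of the default behaviour, built on the same ?multinom example used in the answer: multinom() assembles its model frame with the default na.action (na.omit), so a row with NA in any model variable is silently dropped from the fit, much like listwise deletion in SAS.

```r
library(nnet)
library(MASS)            # the bwt data used in ?multinom is derived from birthwt

example(birthwt)         # creates the bwt data frame (189 rows)
bwt[1, "age"] <- NA      # make one predictor value missing

fit <- multinom(low ~ ., data = bwt, trace = FALSE)
nrow(bwt)                # 189 rows in the data
NROW(fitted(fit))        # 188 -- the incomplete row was excluded from the fit
```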
