multiclass-classification

InvalidArgumentError: 2 root error(s) found. (0) Invalid argument: Incompatible shapes: [4,3] vs. [4,4]

六月ゝ 毕业季﹏ Submitted on 2021-01-29 18:54:42

Question: I am getting the error below when trying to train a multi-class classification model (4 classes) on an image dataset. Even though my output tensor has shape 4, I still hit this issue. Please let me know how to fix it.

Epoch 1/10
---------------------------------------------------------------------------
InvalidArgumentError                      Traceback (most recent call last)
<ipython-input-30-01c6f78f4d4f> in <module>
      4         epochs=epochs,
      5         validation_data=val_data_gen,
----> 6         validation_steps=total_val //
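An "Incompatible shapes: [4,3] vs. [4,4]" error in this setting typically means the labels for a batch of 4 were one-hot encoded over only 3 classes while the final Dense layer outputs 4. A minimal NumPy sketch of the encoding the loss expects (the batch and class ids below are illustrative, not from the original post):

```python
import numpy as np

num_classes = 4                    # must match the units of the final Dense layer
labels = np.array([0, 2, 1, 3])    # integer class ids for a batch of 4 images

# One-hot encode to shape (batch_size, num_classes) -> (4, 4),
# matching the model's [4, 4] output rather than [4, 3].
one_hot = np.eye(num_classes)[labels]
print(one_hot.shape)
```

If the labels come from a directory-based generator, it is worth checking that it actually discovered all 4 class folders; alternatively, integer labels can be kept as-is by switching the loss to sparse_categorical_crossentropy.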

spaCy's BERT model doesn't learn

≯℡__Kan透↙ Submitted on 2020-08-03 09:25:32

Question: I've been trying to use spaCy's pretrained BERT model de_trf_bertbasecased_lg to increase accuracy in my classification project. I used to build a model from scratch using de_core_news_sm and everything worked fine: I had an accuracy of around 70%. But now that I am using the pretrained BERT model instead, I'm getting 0% accuracy. I don't believe it is really performing that badly, so I'm assuming there is simply a problem with my code. I might have missed something important, but I can't figure out what. I

Why is variable importance not reflected in the variables actually used in tree construction?

核能气质少年 Submitted on 2020-06-28 05:10:10

Question: I generated an (unpruned) classification tree in R with the following code:

fit <- rpart(train.set$line ~ CountryCode + OrderType + Bon + SupportCode + prev_AnLP + prev_TXLP + prev_ProfLP + prev_EVProfLP + prev_SplLP + Age + Sex + Unknown.Position + Inc + Can + Pre + Mol,
             data=train.set,
             control=rpart.control(minsplit=5, cp=0.001),
             method="class")

printcp(fit) shows:

Variables actually used in tree construction: Age CountryCode SupportCode OrderType prev_AnLP prev_EVProfLP prev_ProfLP prev
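The question concerns rpart, where variable importance also credits surrogate splits: a variable highly correlated with one actually used can rank high in importance yet never appear in the printed splits. The contrast can be sketched in scikit-learn (all names and data below are illustrative), whose feature_importances_ credits only the splits actually made:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X[:, 2] = X[:, 0] + rng.normal(scale=0.01, size=200)  # feature 2 is a near-copy of feature 0
y = (X[:, 0] > 0).astype(int)                          # target depends only on feature 0

clf = DecisionTreeClassifier(random_state=0).fit(X, y)
used = {f for f in clf.tree_.feature if f >= 0}        # features appearing in actual splits

# scikit-learn: only the feature actually split on gets importance; its
# correlated twin gets ~0. rpart would additionally give the twin high
# importance through surrogate splits, which is why rpart's importance list
# can contain variables that never show up among "variables actually used".
print(clf.feature_importances_, used)
```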

Imbalanced classes in multi-class classification problem

北战南征 Submitted on 2020-06-25 09:18:28

Question: I'm trying to use TensorFlow's DNNClassifier for my multi-class (softmax) classification problem with 4 different classes. I have an imbalanced dataset with the following distribution: Class 0: 14.8%, Class 1: 35.2%, Class 2: 27.8%, Class 3: 22.2%. How do I assign the weights for the DNNClassifier's weight_column for each class? I know how to code this, but I am wondering what values I should give each class.

Answer 1: There are various options to build weights for an unbalanced classification
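One common heuristic is to weight each class inversely to its frequency, normalized so the expected weight of a random example is 1. A sketch using the class frequencies from the question (the variable names are illustrative):

```python
# Class frequencies from the question
freqs = {0: 0.148, 1: 0.352, 2: 0.278, 3: 0.222}
n_classes = len(freqs)

# Inverse-frequency weights: w_c = 1 / (n_classes * f_c).
# Rare classes get weights > 1, common classes < 1.
weights = {c: 1.0 / (n_classes * f) for c, f in freqs.items()}

# Normalization check: the average weight over the dataset is exactly 1,
# so the overall loss scale is unchanged.
avg = sum(f * weights[c] for c, f in freqs.items())
print(weights, avg)
```

Each example's weight_column value would then simply be weights[label] for its class.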

What is the role of base_score in XGBoost multiclass classification?

断了今生、忘了曾经 Submitted on 2020-06-17 09:33:40

Question: jared_mamrot is looking for an answer from a reputable source: a reproducible example illustrating the application of base_score or base_margin to a multiclass XGBoost classification problem (softmax or softprob) using R. I am trying to explore the workings of XGBoost binary classification as well as multi-class. In the binary case, I observed that base_score is
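Part of the answer is mathematical: in the multiclass softmax/softprob objective, a scalar base_score is the same initial margin added to every class's raw score, and a uniform shift of all class margins cancels in the softmax. A pure-Python sketch of that cancellation (the margin values are illustrative; the softmax helper is written out for self-containment):

```python
import math

def softmax(zs):
    m = max(zs)                        # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

margins = [0.3, -1.2, 0.8, 0.1]        # illustrative per-class raw scores
base_score = 0.5                       # XGBoost's default initial score

shifted = [z + base_score for z in margins]
p1 = softmax(margins)
p2 = softmax(shifted)
print(p1, p2)  # a uniform shift leaves the softmax probabilities unchanged
```

This is why a scalar base_score has no visible effect on multiclass probabilities; per-class (or per-example) offsets require base_margin instead.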

Multiclass classification: probabilities and calibration

橙三吉。 Submitted on 2020-02-21 07:02:49

Question: I'm working on a multiclass classification problem with different classifiers, using Python and scikit-learn. I want to use the predicted probabilities, basically to compare the predicted probabilities of the different classifiers for a specific case. I started reading about 'calibration' (here and here, for example) and became confused. From what I understand, a well-calibrated probability means that the probability also reflects the fraction of a certain class. 1) Does this imply
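Comparing predicted probabilities across classifiers is only meaningful if each classifier is calibrated; scikit-learn's CalibratedClassifierCV wraps any estimator to produce calibrated probabilities. A minimal sketch on synthetic data (the dataset, estimator, and parameters are illustrative, not from the original question):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC
from sklearn.calibration import CalibratedClassifierCV

# Synthetic 3-class problem, purely for illustration
X, y = make_classification(n_samples=2000, n_classes=3, n_informative=6,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# LinearSVC has no predict_proba; the wrapper fits a sigmoid (Platt)
# calibrator per class on held-out folds and normalizes across classes.
clf = CalibratedClassifierCV(LinearSVC(dual=False), method="sigmoid", cv=3)
clf.fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)
print(proba.shape)  # one row per test sample, rows summing to 1
```

To check calibration itself, sklearn.calibration.calibration_curve compares predicted probabilities against the observed class fractions per probability bin, which is exactly the "probability reflects the fraction of a class" property the question describes.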