text-classification

How do I determine the binary class predicted by a convolutional neural network on Keras?

泪湿孤枕 submitted on 2020-08-17 07:14:48
Question: I'm building a CNN in Keras to perform sentiment analysis. Everything is working perfectly: the model is trained and ready to be launched to production. However, when I try to predict on new unlabelled data using model.predict(), it only outputs the associated probability. I tried using np.argmax(), but it always outputs 0 even when it should be 1 (my model achieved 80% accuracy on the test set). Here is my code to pre-process the data: # Pre-processing data x = df[df
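A likely explanation: a binary Keras model whose last layer is Dense(1, activation="sigmoid") returns an array of shape (n_samples, 1), and np.argmax over a single column is always 0, so the probability has to be thresholded instead. A minimal sketch, assuming the question's trained model and a pre-processed input array x_new (the 0.5 threshold is the usual default, not something stated in the question):

```python
import numpy as np

# probs has shape (n_samples, 1) for a Dense(1, activation="sigmoid") output,
# with each value being P(class == 1).
probs = model.predict(x_new)

# argmax over a single column is always 0, so threshold the probability instead.
classes = (probs > 0.5).astype("int32").ravel()

# If the model instead ends in Dense(2, activation="softmax"), argmax is correct:
# classes = np.argmax(probs, axis=1)
```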

TF-IDF Vectors can be generated at different levels of input tokens (words, characters, n-grams); which accuracy should be considered?

淺唱寂寞╮ submitted on 2020-08-10 19:20:25
Question: Here you can see I am calculating the accuracy for Count Vectors, Word-Level TF-IDF and N-Gram TF-IDF vectors: accuracy = train_model(classifier, xtrain_count, train_y, xvalid_count) print("NB, Count Vectors: ", accuracy) # Naive Bayes on Word Level TF IDF Vectors accuracy = train_model(classifier, xtrain_tfidf, train_y, xvalid_tfidf) print("NB, WordLevel TF-IDF: ", accuracy) # Naive Bayes on Ngram Level TF IDF Vectors accuracy = train_model(classifier, xtrain_tfidf_ngram, train_y, xvalid_tfidf_ngram) print("NB
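Each tokenization level produces a different feature set and therefore a different model, so there is no single accuracy to quote: the number to report is the validation accuracy of whichever feature set is actually kept, usually chosen as the best of the candidates on held-out data. A minimal self-contained sketch of that comparison with scikit-learn (the toy texts and vectorizer settings are illustrative, not taken from the question):

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import accuracy_score

# Toy corpus; replace with the real texts and labels.
texts = ["good movie", "bad movie", "great plot", "awful acting",
         "loved it", "hated it", "brilliant cast", "terrible script"]
labels = [1, 0, 1, 0, 1, 0, 1, 0]
x_tr, x_va, y_tr, y_va = train_test_split(texts, labels, test_size=0.5, random_state=0)

# One candidate feature set per tokenization level.
candidates = {
    "Count Vectors": CountVectorizer(),
    "WordLevel TF-IDF": TfidfVectorizer(analyzer="word"),
    "CharLevel N-Gram TF-IDF": TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
}

for name, vectorizer in candidates.items():
    clf = MultinomialNB()
    clf.fit(vectorizer.fit_transform(x_tr), y_tr)
    acc = accuracy_score(y_va, clf.predict(vectorizer.transform(x_va)))
    print(f"NB, {name}: {acc:.2f}")
```

The model that wins this comparison is the one to carry forward; the other accuracies only describe discarded alternatives.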

spaCy's BERT model doesn't learn

≯℡__Kan透↙ submitted on 2020-08-03 09:25:32
Question: I've been trying to use spaCy's pretrained BERT model de_trf_bertbasecased_lg to increase accuracy in my classification project. I used to build a model from scratch using de_core_news_sm and everything worked fine: I had an accuracy of around 70%. But now I am using the pretrained BERT model instead and I'm getting 0% accuracy. I don't believe it is really performing that badly, so I'm assuming there is just a problem with my code. I might have missed something important, but I can't figure out what. I
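A minimal sketch of the intended setup, assuming spaCy 2.x with spacy-transformers 0.x (where de_trf_bertbasecased_lg and the trf_textcat pipe live); the labels, example sentences, batch size and epoch count are made up for illustration. Two things that commonly produce a flat 0% here are reinitializing the pretrained weights with begin_training() instead of resume_training(), and a learning rate far too high for transformer fine-tuning:

```python
import random
import spacy
from spacy.util import minibatch

nlp = spacy.load("de_trf_bertbasecased_lg")

# Transformer-aware text categorizer head on top of the pretrained BERT encoder.
textcat = nlp.create_pipe("trf_textcat", config={"exclusive_classes": True})
textcat.add_label("POSITIVE")
textcat.add_label("NEGATIVE")
nlp.add_pipe(textcat, last=True)

# Hypothetical training data in spaCy's (text, annotations) format.
TRAIN_DATA = [
    ("Das Essen war großartig.", {"cats": {"POSITIVE": 1.0, "NEGATIVE": 0.0}}),
    ("Der Service war furchtbar.", {"cats": {"POSITIVE": 0.0, "NEGATIVE": 1.0}}),
]

# resume_training() keeps the pretrained transformer weights;
# begin_training() would reinitialize them.
optimizer = nlp.resume_training()

for epoch in range(4):
    random.shuffle(TRAIN_DATA)
    losses = {}
    for batch in minibatch(TRAIN_DATA, size=8):
        texts, annotations = zip(*batch)
        nlp.update(texts, annotations, sgd=optimizer, drop=0.1, losses=losses)
    print(epoch, losses)
```

If a setup like this still reports 0%, it is also worth checking how accuracy is computed: with exclusive classes the predicted label is the category with the highest score in doc.cats, not a score compared against a fixed threshold.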

Predictor prediction function to classify text using a saved BERT model

六月ゝ 毕业季﹏ submitted on 2020-07-22 06:41:34
Question: I have created a BERT model for classifying a user-generated text string as FAQ or not FAQ, and I have saved it using the export_savedmodel() function. I wish to write a function that takes a list of new strings as input and predicts the output for each of them. I tried using the predictor.from_saved_model() method, but it requires passing key-value pairs for the input ids, segment ids, label ids and input mask. I am a beginner and I do not fully understand what to pass here.
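A minimal sketch of such a prediction function, assuming TensorFlow 1.x, the tokenization module from the google-research/bert repository, and a SavedModel whose serving signature uses the feature names from BERT's run_classifier example (input_ids, input_mask, segment_ids, label_ids); the export directory, vocab path and sequence length below are placeholders:

```python
import numpy as np
from tensorflow.contrib import predictor
import tokenization  # from the google-research/bert repository

MAX_SEQ_LEN = 128                                   # must match the exported model
EXPORT_DIR = "exported_model/1595784000"            # hypothetical SavedModel dir
VOCAB_FILE = "uncased_L-12_H-768_A-12/vocab.txt"    # hypothetical vocab path

tokenizer = tokenization.FullTokenizer(vocab_file=VOCAB_FILE, do_lower_case=True)
predict_fn = predictor.from_saved_model(EXPORT_DIR)

def classify(texts):
    """Run the saved FAQ / not-FAQ classifier on a list of raw strings."""
    input_ids, input_mask, segment_ids, label_ids = [], [], [], []
    for text in texts:
        tokens = ["[CLS]"] + tokenizer.tokenize(text)[:MAX_SEQ_LEN - 2] + ["[SEP]"]
        ids = tokenizer.convert_tokens_to_ids(tokens)
        pad = [0] * (MAX_SEQ_LEN - len(ids))
        input_ids.append(ids + pad)
        input_mask.append([1] * len(ids) + pad)     # 1 for real tokens, 0 for padding
        segment_ids.append([0] * MAX_SEQ_LEN)       # single sentence: all segment 0
        label_ids.append(0)                         # dummy label, ignored at inference
    return predict_fn({
        "input_ids": np.array(input_ids),
        "input_mask": np.array(input_mask),
        "segment_ids": np.array(segment_ids),
        "label_ids": np.array(label_ids),
    })

print(classify(["How do I reset my password?"]))
```

The exact input keys come from the serving_input_fn passed to export_savedmodel(), so they should be checked against that function (or inspected with saved_model_cli show) rather than taken from this sketch.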
