tensorflow-estimator

How to create a variable that persists over tf.estimator.train_and_evaluate evaluation steps?

寵の児 submitted on 2021-02-19 08:38:24
Question: TLDR: How to create a variable that holds the confusion matrix used for computing custom metrics, accumulating the values across all of the evaluation steps? I have implemented custom metrics to use in the tf.estimator.train_and_evaluate pipeline, with a confusion matrix as the crux for them all. My goal is to make this confusion matrix persist over multiple evaluation steps in order to track the learning progress. Using get_variable in the variable scope did not work, since it does not …
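A pattern that usually fits this (a minimal sketch assuming TF 1.x graph-mode Estimators, not the asker's actual code): register the confusion matrix as a local/metric variable and return an assign_add update op, so the evaluation loop accumulates it across batches instead of rebuilding it each step. The function name, variable name, and num_classes below are illustrative.

import tensorflow.compat.v1 as tf  # TF 1.x-style graph APIs (get_variable, assign_add)

def persistent_confusion_matrix(labels, predictions, num_classes):
    # Metric variables live in LOCAL_VARIABLES / METRIC_VARIABLES, so the
    # evaluation loop initialises them once and then accumulates into them
    # on every evaluation batch.
    total_cm = tf.get_variable(
        'total_confusion_matrix',
        shape=[num_classes, num_classes],
        dtype=tf.float64,
        initializer=tf.zeros_initializer(),
        trainable=False,
        collections=[tf.GraphKeys.LOCAL_VARIABLES,
                     tf.GraphKeys.METRIC_VARIABLES])
    batch_cm = tf.cast(
        tf.math.confusion_matrix(labels, predictions, num_classes=num_classes),
        tf.float64)
    update_op = tf.assign_add(total_cm, batch_cm)
    # (value, update_op) is the pair eval_metric_ops expects; derive the
    # custom metrics from total_cm.
    return total_cm, update_op

# In model_fn (names hypothetical):
# eval_metric_ops = {'conf_mat': persistent_confusion_matrix(labels, predicted_classes, num_classes)}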

TensorFlow Estimator ServingInputReceiver features vs receiver_tensors: when and why?

时间秒杀一切 submitted on 2021-02-15 10:04:26
Question: In a previous question the purpose and structure of the serving_input_receiver_fn is explored, and the answer gives:

def serving_input_receiver_fn():
    """For the sake of the example, let's assume your input to the network
    will be a 28x28 grayscale image that you'll then preprocess as needed"""
    input_images = tf.placeholder(dtype=tf.uint8, shape=[None, 28, 28, 1],
                                  name='input_images')
    # here you do all the operations you need on the images before they can
    # be fed to the net (e.g., normalizing, …
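To make the distinction concrete, here is a minimal sketch (TF 1.x, shapes taken from the excerpt above): receiver_tensors are the raw placeholders the serving signature exposes to the client, while features is what model_fn actually receives after preprocessing; the normalization step and the 'images' key are illustrative assumptions.

import tensorflow as tf  # TF 1.x graph mode, as in the excerpt

def serving_input_receiver_fn():
    # receiver_tensors: the raw tensors the client sends in the request.
    input_images = tf.placeholder(dtype=tf.uint8, shape=[None, 28, 28, 1],
                                  name='input_images')
    # features: what model_fn receives after preprocessing the raw input;
    # the key must match what model_fn expects ('images' is assumed here).
    features = {'images': tf.cast(input_images, tf.float32) / 255.0}
    receiver_tensors = {'input_images': input_images}
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)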

tf.estimator.add_metrics ends in Shapes (None, 12) and (None,) are incompatible

为君一笑 submitted on 2021-02-11 15:14:32
Question: I am using a DNNClassifier as my estimator and wanted to add some additional metrics to it. The code I am using is basically the one from the tf.estimator.add_metrics documentation (https://www.tensorflow.org/api_docs/python/tf/estimator/add_metrics):

def my_auc(labels, predictions):
    auc_metric = tf.keras.metrics.AUC(name="my_auc")
    auc_metric.update_state(y_true=labels, y_pred=predictions['logits'])
    return {'auc': auc_metric}

hidden_layers = len(training_data.__call__().element…
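For reference, one way the shapes can be made compatible (a sketch, not necessarily the accepted fix): DNNClassifier hands the metric_fn sparse class ids of shape (None,), while 'logits' is (None, 12), so the metric can compare one-hot labels against 'probabilities' instead. The value 12 comes from the error message and integer class ids are assumed.

import tensorflow as tf

def my_auc(labels, predictions):
    num_classes = 12  # taken from the (None, 12) shape in the error message
    # One-hot the sparse labels so y_true matches the (None, 12) predictions;
    # integer class ids (no label_vocabulary) are assumed here.
    one_hot_labels = tf.one_hot(
        tf.cast(tf.reshape(labels, [-1]), tf.int32), depth=num_classes)
    auc_metric = tf.keras.metrics.AUC(name="my_auc")
    auc_metric.update_state(y_true=one_hot_labels,
                            y_pred=predictions['probabilities'])
    return {'auc': auc_metric}

# estimator = tf.estimator.add_metrics(estimator, my_auc)  # as in the docs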

TensorFlow v2: Replacement for tf.contrib.predictor.from_saved_model

∥☆過路亽.° submitted on 2020-12-31 04:59:02
Question: So far, I was using tf.contrib.predictor.from_saved_model to load a SavedModel (tf.estimator model class). However, this function has unfortunately been removed in TensorFlow v2. In TensorFlow v1, my code was the following:

predict_fn = predictor.from_saved_model(model_dir + '/' + model,
                                        signature_def_key='predict')
prediction_feed_dict = dict()
for key in predict_fn._feed_tensors.keys():
    # forec_data is a DataFrame holding the data to be fed in
    for index in forec_data.index:
        …
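In TF2 the usual stand-in is tf.saved_model.load plus the signatures map. A rough sketch reusing the names from the question (model_dir, model, forec_data) and assuming the model was exported with a raw serving input receiver, so the DataFrame columns can be fed by name:

import tensorflow as tf  # TF 2.x

loaded = tf.saved_model.load(model_dir + '/' + model)
predict_fn = loaded.signatures['predict']  # same key as signature_def_key

# The signature's named inputs replace predict_fn._feed_tensors.keys().
print(predict_fn.structured_input_signature)

# forec_data is the DataFrame from the question; feeding whole columns at
# once replaces the per-index feed_dict loop.
outputs = predict_fn(**{name: tf.constant(forec_data[name].values)
                        for name in forec_data.columns})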

Tensorflow : logits and labels must have the same first dimension

蹲街弑〆低调 submitted on 2020-08-01 09:10:29
Question: I am new to TensorFlow and I want to adapt the MNIST tutorial (https://www.tensorflow.org/tutorials/layers) to my own data (images of 40x40). This is my model function:

def cnn_model_fn(features, labels, mode):
    # Input Layer
    input_layer = tf.reshape(features, [-1, 40, 40, 1])
    # Convolutional Layer #1
    conv1 = tf.layers.conv2d(
        inputs=input_layer,
        filters=32,
        kernel_size=[5, 5],
        # To specify that the output tensor should have the same width and
        # height values as the input tensor
        # value can be …
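A frequent cause of this error when moving the 28x28 tutorial to 40x40 inputs is the hard-coded flatten size: after two 2x2 poolings a 40x40 image yields 10x10 feature maps, not 7x7, and a wrong reshape silently changes the batch dimension, so the first dimensions of logits and labels stop matching. A sketch of the relevant tail of cnn_model_fn, assuming the tutorial's two conv/pool blocks (padding='same', 64 filters in the second block) and, purely for illustration, 10 classes:

import tensorflow as tf  # TF 1.x layers API, as in the tutorial

# pool2 and labels come from the tutorial's cnn_model_fn sketched above.
pool2_flat = tf.reshape(pool2, [-1, 10 * 10 * 64])  # 40 -> 20 -> 10 after two 2x2 pools
dense = tf.layers.dense(inputs=pool2_flat, units=1024, activation=tf.nn.relu)
logits = tf.layers.dense(inputs=dense, units=10)  # units = number of classes

# sparse_softmax_cross_entropy expects labels of shape [batch_size] and
# logits of shape [batch_size, num_classes]; a wrong flatten size above is
# what typically breaks that first-dimension match.
loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)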
