I currently have two numpy arrays:
X - (157, 128) - 157 sets of 128 features
Y - (157) - classifications of the feature sets
self.name is not present in the column_to_tensors dictionary, that is what the error says, and the value of self.name is an empty string. The solution might be changing the tf.estimator.inputs.numpy_input_fn call in your train_input_fn line to:
train_input_fn = tf.estimator.inputs.numpy_input_fn(
    x=X,
    y=Y,
    num_epochs=None,
    shuffle=True)
I think the x argument has to be a numpy array here, and you are giving it a dictionary.
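For context, both forms show up in this answer: a plain array (above) and a dict (in the update below). When x is a dict, each key becomes a feature name that the feature columns must match, which is the crux of the error. A minimal runnable sketch of both forms, using tiny synthetic data of my own (whether the plain-array form is accepted depends on the TF 1.x version):

import numpy as np
import tensorflow as tf

X = np.random.rand(157, 128).astype(np.float32)  # 157 examples, 128 features
Y = np.random.randint(0, 2, size=157)            # one label per example

# Plain-array form: the whole array is treated as a single feature.
array_fn = tf.estimator.inputs.numpy_input_fn(
    x=X, y=Y, num_epochs=None, shuffle=True)

# Dict form: the key "x" becomes the feature name, so any feature
# column built for this input must also be named "x".
dict_fn = tf.estimator.inputs.numpy_input_fn(
    x={"x": X}, y=Y, num_epochs=None, shuffle=True)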
I will stick to their tutorial and not do anything fancy:
real_feature_column = real_valued_column(...)
sparse_feature_column = sparse_column_with_hash_bucket(...)

estimator = SVM(
    example_id_column='example_id',
    feature_columns=[real_feature_column, sparse_feature_column],
    l2_regularization=10.0)

# Input builders
def input_fn_train():  # returns x, y
    ...

def input_fn_eval():  # returns x, y
    ...

estimator.fit(input_fn=input_fn_train)
estimator.evaluate(input_fn=input_fn_eval)
estimator.predict(x=x)
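In that snippet, example_id_column='example_id' names a feature that the input functions must supply, and the SDCA optimizer behind SVM expects it to hold a unique string per example. A hedged sketch of what input_fn_train might look like (the data and the 'real_feature' name are my own illustration, not from the tutorial):

import tensorflow as tf

def input_fn_train():
    # features must include the id column named by example_id_column
    features = {
        'example_id': tf.constant(['0', '1', '2', '3']),  # unique string ids
        'real_feature': tf.constant([[1.0], [0.0], [0.0], [1.0]]),
    }
    labels = tf.constant([[1], [0], [0], [1]])
    return features, labels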
===============UPDATED==============
self.name is an empty string, and that empty string is not present in the dictionary that you pass to infer_real_valued_columns_from_input, which creates the _RealValuedColumn object. In tf.contrib.learn.infer_real_valued_columns_from_input(X), the X that you pass has to be a dictionary, so that the self.name of the _RealValuedColumn object is initialized by the key of the dictionary you pass. So this is what I did:
import tensorflow as tf
import numpy as np

X = np.array([[1], [0], [0], [1]])
Y = np.array([[1], [0], [0], [1]])
dic = {"x": X}

train_input_fn = tf.estimator.inputs.numpy_input_fn(
    x=dic,
    y=Y,
    num_epochs=None,
    shuffle=True)

svm = tf.contrib.learn.SVM(
    example_id_column='x',
    feature_columns=tf.contrib.learn.infer_real_valued_columns_from_input(dic),
    l2_regularization=0.1)
svm.fit(input_fn=train_input_fn, steps=10)
Now this removes the above error, but it gives a new error: TypeError: Input 'input' of 'SdcaFprint' Op has type int64 that does not match expected type of string.
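That error comes from the SdcaFprint op, which fingerprints the example_id column and therefore expects strings, while the 'x' column above holds int64 values. A minimal sketch of one way past it, under the assumption that the ids merely need to be a separate column of unique strings (the 'example_id' name and the string ids are my own choice, untested here):

import tensorflow as tf
import numpy as np

X = np.array([[1.0], [0.0], [0.0], [1.0]])  # float features
Y = np.array([[1], [0], [0], [1]])
ids = np.array(['0', '1', '2', '3'])        # unique string id per example
dic = {"x": X, "example_id": ids}

train_input_fn = tf.estimator.inputs.numpy_input_fn(
    x=dic,
    y=Y,
    num_epochs=None,
    shuffle=True)

svm = tf.contrib.learn.SVM(
    example_id_column='example_id',
    # infer columns from the real-valued feature only, not the string ids
    feature_columns=tf.contrib.learn.infer_real_valued_columns_from_input({"x": X}),
    l2_regularization=0.1)
svm.fit(input_fn=train_input_fn, steps=10)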