How to use StreamingDataFeeder as contrib.learn.Estimator.fit()'s input_fn?

Posted by 醉酒当歌 on 2019-12-11 07:12:18

Question


I have recently started using the tensorflow.contrib.learn (skflow) library and really like it. However, I am facing an issue with Estimator: its fit function accepts either

  1. (x, y, and batch_size) - the problem with this approach is that it does not let me specify the number of epochs or use an arbitrary data source.
  2. input_fn - besides letting me set the number of epochs, it gives me much more flexibility over the source of training data (which in my case comes directly from a database).

Now, I am aware that I could write an input_fn that reads files; however, since I am not interested in dealing with files, the following functions are not useful to me:

  • tf.contrib.learn.read_batch_examples
  • tf.contrib.learn.read_batch_features
  • tf.contrib.learn.read_batch_record_features

Ideally, I would like to use StreamingDataFeeder as input_fn. Any ideas how I can achieve this?


Answer 1:


StreamingDataFeeder is used when you provide iterators as x / y to Estimator's fit / predict / evaluate.

Example:

import numpy as np
import tensorflow as tf

# Generators that yield one example at a time
# (use xrange on Python 2 if you want a lazy range).
x = (np.array([i]) for i in range(10**10))
y = (np.array([i + 1]) for i in range(10**10))

lr = tf.contrib.learn.LinearRegressor(
    feature_columns=[tf.contrib.layers.real_valued_column('')])

# Only consumes 1000 steps * batch_size 10 = 10,000 values from the iterators.
lr.fit(x, y, steps=1000, batch_size=10)

If you want to use input_fn for feeding data, you need to use graph operations to read / process the data. For example, you can create a C++ op that produces your data (it could listen on a port or read from a database) and converts it into a Tensor. This approach is mainly intended for reading data from files, but other readers can be implemented as well.
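For illustration, here is a minimal sketch (not from the answer above) of an input_fn that pulls batches from an arbitrary Python source by wrapping it with tf.py_func instead of writing a custom C++ op. read_batch_from_db is a hypothetical stand-in for whatever database read you actually use, and the feature dict key '' is assumed to match real_valued_column('') from the example above:

import numpy as np
import tensorflow as tf

def read_batch_from_db():
    # Hypothetical reader: in practice this would fetch the next batch of
    # rows from your database; here it just fabricates a batch of 10 rows.
    xs = np.random.rand(10, 1).astype(np.float32)
    ys = (xs + 1.0).astype(np.float32)
    return xs, ys

def input_fn():
    # Wrap the Python reader in a graph op so the Estimator pulls a fresh
    # batch from it every time the training graph is run.
    xs, ys = tf.py_func(read_batch_from_db, [], [tf.float32, tf.float32])
    xs.set_shape([10, 1])
    ys.set_shape([10, 1])
    return {'': xs}, ys

lr = tf.contrib.learn.LinearRegressor(
    feature_columns=[tf.contrib.layers.real_valued_column('')])
lr.fit(input_fn=input_fn, steps=1000)

Note that batch_size is not passed to fit here; with input_fn, each call to the wrapped reader decides how many rows a training step consumes.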



Source: https://stackoverflow.com/questions/39855375/how-to-use-streamingdatafeeder-as-contrib-learn-estimator-fits-input-fn
