theano

Theano.function equivalent in TensorFlow

Submitted by 孤人 on 2019-12-02 06:06:42
I am wondering if there is any equivalent in TensorFlow to theano.function(inputs=[x, y], outputs=..., updates=..., givens=...), where inputs is the list of input variables, outputs the values to be returned, updates the "state" values to be modified, and givens the substitutions to the graph. The run method on the tf.Session class is quite close to theano.function. Its fetches and feed_dict arguments are the moral equivalents of outputs and givens. Theano's function returns an object that acts like a Python function and executes the computational graph when called. In TensorFlow, you execute the computational graph using the session's run method. If …
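A minimal sketch of that correspondence, assuming the TensorFlow 1.x graph-mode API (the placeholder names x, y, z and the counter variable are illustrative, not from the original question):

    import tensorflow as tf  # assumes TensorFlow 1.x graph mode

    # Symbolic inputs play the role of theano.function's `inputs`.
    x = tf.placeholder(tf.float32, name="x")
    y = tf.placeholder(tf.float32, name="y")
    z = x * y

    # A Variable plus an assign op plays roughly the role of theano's `updates`.
    counter = tf.Variable(0, name="counter")
    increment = tf.assign_add(counter, 1)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # fetches ~ theano's `outputs`, feed_dict ~ theano's `givens`/inputs
        result, _ = sess.run([z, increment], feed_dict={x: 2.0, y: 3.0})
        print(result)  # 6.0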

Issue with simple CAE

Submitted by 孤人 on 2019-12-02 04:36:50
It looks like a simple CAE is not working for the Carvana dataset. I'm trying a simple CAE for the Carvana dataset. You can download it here. My code is the following: import numpy as np import pandas as pd import matplotlib.pyplot as plt from skimage.io import imread from skimage.transform import downscale_local_mean from skimage.color import rgb2grey from os.path import join, isfile from tqdm import tqdm_notebook from sklearn.model_selection import train_test_split from keras.layers import Conv2D, MaxPooling2D, Conv2DTranspose, Input, concatenate from keras.models import Model from keras.callbacks import …
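For context, a minimal convolutional autoencoder along these lines might look as follows. This is only a sketch: the 128×128 single-channel input shape, layer sizes, and loss are assumptions, not the poster's actual network.

    from keras.layers import Conv2D, MaxPooling2D, Conv2DTranspose, Input
    from keras.models import Model

    # Assumed input: downscaled grayscale Carvana images of shape (128, 128, 1).
    inp = Input(shape=(128, 128, 1))

    # Encoder: two conv + pool stages, 128x128 -> 32x32
    x = Conv2D(16, (3, 3), activation='relu', padding='same')(inp)
    x = MaxPooling2D((2, 2))(x)
    x = Conv2D(32, (3, 3), activation='relu', padding='same')(x)
    x = MaxPooling2D((2, 2))(x)

    # Decoder: transposed convolutions back to the input resolution
    x = Conv2DTranspose(32, (3, 3), strides=2, activation='relu', padding='same')(x)
    x = Conv2DTranspose(16, (3, 3), strides=2, activation='relu', padding='same')(x)
    out = Conv2D(1, (3, 3), activation='sigmoid', padding='same')(x)

    autoencoder = Model(inp, out)
    autoencoder.compile(optimizer='adam', loss='binary_crossentropy')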

Unable to create lambda function in hierarchical pymc3 model

Submitted by 空扰寡人 on 2019-12-02 04:22:02
I'm trying to create the model shown below with PyMC 3 but can't figure out how to properly map probabilities to the observed data with a lambda function. import numpy as np import pymc as pm data = np.array([[0, 0, 1, 1, 2], [0, 1, 2, 2, 2], [2, 2, 1, 1, 0], [1, 1, 2, 0, 1]]) (D, W) = data.shape V = len(set(data.ravel())) T = 3 a = np.ones(T) b = np.ones(V) with pm.Model() as model: theta = [pm.Dirichlet('theta_%s' % i, a, shape=T) for i in range(D)] z = [pm.Categorical('z_%i' % i, theta[i], shape=W) for i in range(D)] phi = [pm.Dirichlet('phi_%i' % i, b, shape=V) for i in range(T)] w = [pm
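One common way to express that mapping without a Python lambda is to stack the phi vectors into a Theano tensor and index it with z, so the observed Categorical gets its probabilities directly from the graph. This is only a sketch, assuming pymc3 and theano.tensor are available (the original snippet imports pymc as pm):

    import numpy as np
    import pymc3 as pm
    import theano.tensor as tt

    data = np.array([[0, 0, 1, 1, 2],
                     [0, 1, 2, 2, 2],
                     [2, 2, 1, 1, 0],
                     [1, 1, 2, 0, 1]])
    (D, W) = data.shape
    V = len(set(data.ravel()))
    T = 3
    a = np.ones(T)
    b = np.ones(V)

    with pm.Model() as model:
        theta = [pm.Dirichlet('theta_%i' % i, a, shape=T) for i in range(D)]
        phi = [pm.Dirichlet('phi_%i' % i, b, shape=V) for i in range(T)]
        z = [pm.Categorical('z_%i' % i, p=theta[i], shape=W) for i in range(D)]
        # Stack the T topic-word vectors into a (T, V) tensor and index it with z[i],
        # giving a (W, V) probability matrix for the W observed words of document i.
        phi_stacked = tt.stack(phi)
        w = [pm.Categorical('w_%i' % i, p=phi_stacked[z[i]], observed=data[i])
             for i in range(D)]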

logistic_sgd module, where to find it?

Submitted by 邮差的信 on 2019-12-01 22:44:03
Question: I'm doing a deep learning tutorial and my Python cannot find that module. from logistic_sgd import LogisticRegression, load_data raises ImportError: No module named logistic_sgd. How can I install it? Answer 1: Download and save logistic_sgd.py from the following link: http://deeplearning.net/tutorial/code/logistic_sgd.py and store it in the working directory. That's it! Answer 2: Actually, you should download all the code from the following link; it contains all the dependencies you need. Here is the source code: http://deeplearning.net/tutorial/code/ To download all the code necessary, you can go to GitHub.
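As an illustration of the first answer, one way to fetch the file into the working directory (assuming Python 3 for the download step; the tutorial code itself targets Python 2-era Theano):

    import urllib.request

    # Download logistic_sgd.py next to your script so the tutorial import resolves.
    url = "http://deeplearning.net/tutorial/code/logistic_sgd.py"
    urllib.request.urlretrieve(url, "logistic_sgd.py")

    from logistic_sgd import LogisticRegression, load_data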

3d sliding window operation in Theano?

Submitted by 爷，独闯天下 on 2019-12-01 15:40:45
TL;DR: Is there a 3-dimensional-friendly implementation of theano.tensor.nnet.neighbours.images2neibs? I would like to perform voxel-wise classification of a volume (N×N×N) using a neural network that takes an n×n×n image as input, where N > n. To classify each voxel in the volume, I have to iterate through each voxel. For each iteration, I obtain the neighborhood voxels and pass them as the input to the neural network. This is simply a sliding window operation, where the operation is the neural network. While my neural network is implemented in Theano, the sliding window implementation is in Python/NumPy …
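For reference, the neighbourhood-extraction step alone can be written in plain NumPy as below. This is a sketch with an assumed cubic volume, window side n, stride 1 and no padding; it is not the vectorised Theano equivalent the question is asking for.

    import numpy as np

    def extract_3d_windows(volume, n):
        """Return an array of shape (num_windows, n, n, n) with every n*n*n patch."""
        N = volume.shape[0]  # assumes a cubic N x N x N volume
        windows = []
        for i in range(N - n + 1):
            for j in range(N - n + 1):
                for k in range(N - n + 1):
                    windows.append(volume[i:i + n, j:j + n, k:k + n])
        return np.stack(windows)

    vol = np.random.rand(8, 8, 8)
    patches = extract_3d_windows(vol, 3)
    print(patches.shape)  # (216, 3, 3, 3)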

perform the exact same convolution as in theano's conv2d

Submitted by 感情迁移 on 2019-12-01 11:59:35
I have an existing classification model that was trained using Theano's conv2d under theano.tensor.nnet. Now I have to use this model to do some sort of prediction in Java. I implemented a simple convolution in Python (in the end, I will code it in Java) as per some documentation (https://developer.apple.com/Library/ios/documentation/Performance/Conceptual/vImage/ConvolutionOperations/ConvolutionOperations.html). For example, for a 2×2 kernel (k11, k12, k21, k22), one of the areas under the kernel is (a11, a12, a21, a22). The convolution is performed as a11*k11 + a12*k12 + a21*k21 + a22*k22.
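The formula above is a sliding dot product (cross-correlation); Theano's conv2d flips the kernel by default (filter_flip=True), so reproducing its output with this scheme requires rotating the kernel by 180°. A small NumPy sketch of that check (the example image and kernel values are arbitrary):

    import numpy as np

    def correlate2d_valid(image, kernel):
        # Sliding dot product exactly as described in the text ("valid" mode, no padding).
        kh, kw = kernel.shape
        out_h = image.shape[0] - kh + 1
        out_w = image.shape[1] - kw + 1
        out = np.zeros((out_h, out_w))
        for i in range(out_h):
            for j in range(out_w):
                out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
        return out

    image = np.arange(16, dtype=float).reshape(4, 4)
    kernel = np.array([[1.0, 2.0], [3.0, 4.0]])
    print(correlate2d_valid(image, kernel))              # the formula as written
    print(correlate2d_valid(image, kernel[::-1, ::-1]))  # kernel flipped 180°, as conv2d does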

How to write a custom Deterministic or Stochastic in pymc3 with theano.op?

Submitted by 笑着哭i on 2019-12-01 10:29:54
I'm doing some pymc3 work and I would like to create custom Stochastics; however, there doesn't seem to be a lot of documentation about how it's done. I know how to use the as_op way, however apparently that makes it impossible to use the NUTS sampler, in which case I don't see the advantage of pymc3 over pymc. The tutorial mentions that it can be done by inheriting from theano.Op. But can anyone show me how that would work (I'm still getting started on theano)? I have two Stochastics that I want to define. The first one should be easier: it's an N-dimensional vector F that has only constant parent …
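The point of subclassing theano.Op rather than using as_op is that you can also implement grad(), which is what lets gradient-based samplers like NUTS work. A minimal sketch of such an Op follows; the squaring function and its gradient are placeholders for illustration, not the poster's actual likelihood.

    import numpy as np
    import theano
    import theano.tensor as tt

    class SquareOp(theano.Op):
        # Declare input/output types so make_node is generated automatically.
        itypes = [tt.dvector]
        otypes = [tt.dvector]

        def perform(self, node, inputs, output_storage):
            # Numerical computation done with plain NumPy.
            (x,) = inputs
            output_storage[0][0] = x ** 2

        def grad(self, inputs, output_grads):
            # Symbolic gradient: d(x^2)/dx = 2x, chained with the upstream gradient.
            (x,) = inputs
            (g,) = output_grads
            return [2 * x * g]

    x = tt.dvector('x')
    y = SquareOp()(x)
    f = theano.function([x], theano.grad(y.sum(), x))
    print(f(np.array([1.0, 2.0, 3.0])))  # [2. 4. 6.]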
