Neural Network for regression


Question


The way I understand regression for neural networks is that weights are applied to each x-input from the dataset. I want something slightly different.

I want the weights applied inside the function that computes each x-input; we'll call the inputs of that function the s-inputs.

The function that computes the x-inputs is a summation over all s-inputs, and I want each s-input to have its own weight (see the sketch below).
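
Concretely, this is roughly what I mean (the sizes here are just placeholders, not part of my actual setup):

```python
import numpy as np

s = np.array([0.2, 1.0, -0.5, 0.3, 0.7])  # the s-inputs
W = np.random.randn(3, 5)                 # one weight per s-input, per x-input

x = W @ s  # each x_i is a weighted sum of all the s-inputs
```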

I say regression because I want the end result to be a nice continuous function for the mapping x -> y

...but that is accomplished through training the function that computes the x-inputs

What baffles me is that as we train the weights that compute, say, x1, we are also training the weights that compute x2, since they share the same summation function. Because the function that computes the x-inputs is trained simultaneously across all x-inputs, the plot of x -> y will keep morphing during training, and I need it to morph into something continuous.

You can think of it like this: the y-value is the ground truth, but the weights we are training belong to the function that computes the x-value -- the weights on the s-inputs.

Can this be done? If so where should I start?
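
To make the question concrete, here is a rough sketch of the setup I have in mind (PyTorch; every size, layer, and name below is just a placeholder, not a claim about the right architecture):

```python
import torch
import torch.nn as nn

# hypothetical sizes: 5 s-inputs -> 3 x-inputs -> 1 y
s_to_x = nn.Linear(5, 3, bias=False)  # the trainable weights on the s-inputs
x_to_y = nn.Sequential(nn.Linear(3, 8), nn.Tanh(), nn.Linear(8, 1))  # the x -> y mapping

model = nn.Sequential(s_to_x, x_to_y)
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

# toy data: a batch of s-input vectors and their ground-truth y values
S = torch.randn(32, 5)
y = torch.randn(32, 1)

for _ in range(100):
    opt.zero_grad()
    y_hat = model(S)           # s-inputs -> x-inputs -> predicted y
    loss = loss_fn(y_hat, y)   # compare against the ground-truth y
    loss.backward()            # gradients flow back into the s-input weights
    opt.step()
```

The part I care about is s_to_x: those are the weights on the s-inputs that I actually want trained, while the x -> y part is just the continuous mapping I want to end up with.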

Source: https://stackoverflow.com/questions/34746497/neural-network-for-regression
