Neural network sine approximation

Asked by 粉色の甜心 on 2020-12-07 05:07

After spending days failing to get a neural network working for Q-learning, I decided to go back to basics and try a simple function approximation to see if everything was working.
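
For context, here is a minimal sketch of the kind of setup the question describes, assuming the training data are noise-free samples of sin(x); the names X and Y are assumptions chosen to match the answer below:

    import numpy as np

    # Hypothetical training data: 1000 evenly spaced samples of sin(x) on [0, 2*pi].
    # The question's actual data-generation code is not shown, so this is only an
    # assumed reconstruction.
    X = np.linspace(0, 2 * np.pi, 1000).reshape(-1, 1)
    Y = np.sin(X)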

3 Answers
Answered by 攒了一身酷 on 2020-12-07 05:47

    With these changes:

    • Change the activations to relu
    • Remove kernel_initializer (i.e. keep the default 'glorot_uniform')
    • Use the Adam optimizer
    • Train for 100 epochs

    i.e.

    from keras.models import Sequential
    from keras.layers import Dense

    # Three hidden layers of 20 ReLU units each (default 'glorot_uniform'
    # initialization), plus a single linear output unit for regression
    regressor = Sequential()
    regressor.add(Dense(units=20, activation='relu', input_dim=1))
    regressor.add(Dense(units=20, activation='relu'))
    regressor.add(Dense(units=20, activation='relu'))
    regressor.add(Dense(units=1))
    regressor.compile(loss='mean_squared_error', optimizer='adam')

    regressor.fit(X, Y, epochs=100, verbose=1, batch_size=32)
    

    and the rest of your code unchanged, here is the result (the original answer's plot is not reproduced here):
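
    To see the fit for yourself, plot the trained network's predictions against the targets; this sketch assumes matplotlib is available:

    import matplotlib.pyplot as plt

    # Overlay the network's predictions on the sine targets it was trained on.
    Y_pred = regressor.predict(X)
    plt.plot(X, Y, label='sin(x)')
    plt.plot(X, Y_pred, label='network prediction')
    plt.legend()
    plt.show()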

    Tinker, again and again...
